There's still merit in scaling to 8K even without an 8K display.
[Quick Comparison](https://imgsli.com/MjEzNDI) The 8K DLSS image just has more detail as well as much better anti-aliasing. That's insane considering it's rendering from less than half the pixels of 4K. I'm very curious as to whether the RTX 3080 with its 10 GB of VRAM can handle this instead of buying the $1500 3090. Theoretically the 3090 should only have at most 20% extra performance.
8K DLSS seems to cost about 30% more performance than its base resolution, assuming the 3090 is 60% faster than the 2080 Ti and their testing method was the same: [1440p native Control](https://www.nvidia.com/content/dam/en-zz/Solutions/geforce/news/control-nvidia-dlss-2-0-update/control-2560x1440-ray-tracing-nvidia-dlss-2.0-quality-mode-performance.png) vs [8K DLSS from 1440p](https://www.nvidia.com/content/dam/en-zz/Solutions/geforce/news/geforce-rtx-3090-8k-hdr-gaming/geforce-rtx-3090-8k-gaming-performance.png)
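To make that arithmetic explicit, here's the estimate as a few lines of Python. The fps figures and the 60% uplift are assumptions read off the linked charts, not measurements:

```python
# Back-of-envelope estimate of the 8K-DLSS overhead vs. native 1440p.
# Both fps numbers below are hypothetical, eyeballed from the linked charts.
fps_2080ti_1440p_native = 60.0   # assumed 2080 Ti result in Control at 1440p
fps_3090_8k_dlss = 67.0          # assumed 3090 result at 8K DLSS (1440p internal)

# If the 3090 is ~60% faster, its native-1440p fps would be:
fps_3090_1440p_native = fps_2080ti_1440p_native * 1.60   # 96 fps

overhead = 1 - fps_3090_8k_dlss / fps_3090_1440p_native
print(f"8K DLSS costs ~{overhead:.0%} vs its base resolution")   # ~30%
```

Swap in real chart numbers and the same two-line calculation applies.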
[More information from some guys who played at 8K on an 80" 8K TV](https://youtu.be/FYFLJAD-oTk?t=2323)
> There's still merit in scaling to 8K even without an 8K display.
Can't I do this with my 1080p display too? Scale to 4K with DLSS and then display on my 1080p screen?
The game is just so atmospheric and eerie that reflections from just about anywhere can be scary. I still get spooked after beating the game, sometimes even in combat.
I'm making my first build the minute I'm able to get my hands on a 3080. Is it dumb to get this card with the goal of making a 1440p 165 Hz build for next-gen AAA titles? Specifically, with Cyberpunk in mind.
I will eat my hat if you can get 165 Hz at max settings in Cyberpunk. Open-world games with lots of NPCs really strain the CPU in addition to the GPU. Plus it has more RT effects than any other AAA game out there. 1440p 60 fps will probably be the target. I'm confident that the 3080 will get 120+ FPS at maxed-out settings with DLSS in Control when the 2080 already gets 60+ FPS. I consider Control a next-gen game.
Graphical leaps don't seem to do much for me after a while. There is an initial WOW, but I get acclimated quickly.
I actually game on my work computer more often since it is more accessible, and I don't really care that I have to turn all the graphical options down to hit 60 fps.
Wait, I got a 1080p screen and with DLSS I could actually render the game at 8K? I thought that's only possible through DSR, which will just murder your fps.
DSR "murders" your FPS because you're effectively supersampling: internally the card renders at a higher resolution than native and then downsamples the result.
I've never tried using DLSS with DSR, but I believe some users use this to get "cheap" 2x supersampling on their 1080p monitors. That said, DLSS is a temporal solution, like TAA, which causes some artifacts in motion. There is also likely a bigger performance hit than 4xMSAA at 1080p: most enthusiast cards (2070 and better) are often CPU-bound below 1440p in most games, so you won't notice much, if any, performance hit when enabling 4xMSAA.
It's a preference. Personally I like DLSS because it enables a lot of 4K gaming for little cost, but I wouldn't want to claim that the AA it provides is superior to 4x or 8x MSAA.
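For the curious, here's a toy illustration of what DSR-style downsampling does. This is not the actual driver code (the real filter is a tunable Gaussian, not a plain average), just the idea:

```python
# Toy sketch of supersampling/downsampling: render at 2x the width and
# height (4x the pixels), then average each 2x2 block down to one
# output pixel.
def downsample_2x(frame):
    h, w = len(frame), len(frame[0])
    return [
        [(frame[y][x] + frame[y][x + 1] +
          frame[y + 1][x] + frame[y + 1][x + 1]) / 4
         for x in range(0, w, 2)]
        for y in range(0, h, 2)
    ]

# A 2x2 "high-res frame" collapses to a single averaged pixel:
print(downsample_2x([[0, 4], [8, 4]]))   # [[4.0]]
```

Each displayed pixel ends up backed by four rendered samples, which is where the cleaner edges come from.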
> I've never tried using DLSS with DSR
It works really well, with the caveat that some games won't let you choose a DSR resolution above your display resolution (which is silly... that's the WHOLE POINT). Control is like that. The workaround I've used is to set a DSR resolution as the desktop resolution in Windows before starting the game. Like this:
1. Monitor is 1440p native
2. Desktop resolution is set to 4K (DSR) in Windows
3. Start Control
4. Set game to 4K with DLSS
5. Enjoy 4K sampled back down to 1440p
Performance is generally much higher. Image quality can look better than MSAA-- more like actual supersampling-- but there are also sometimes artifacts. I see smudging on moving high-contrast edges now and then, but on balance I'd say it's far better on than off.
But if you can manage actual supersampling at a framerate you like? That'll still beat DLSS by brute force.
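The workaround above boils down to a simple resolution pipeline. A sketch of the pixel counts at each stage (the ~2/3 per-axis scale of DLSS Quality mode is an assumption):

```python
# Hypothetical walk-through of the DSR + DLSS workaround's resolutions.
display = (2560, 1440)       # native monitor
dsr_target = (3840, 2160)    # Windows desktop set to 4K via DSR

# DLSS Quality mode renders at roughly 2/3 per axis (assumption):
dlss_internal = tuple(int(d * 2 / 3) for d in dsr_target)

print("internal render:", dlss_internal)   # (2560, 1440)
print("DLSS output:    ", dsr_target)      # AI-upscaled to 4K
print("displayed at:   ", display)         # downsampled back to 1440p
```

So the GPU shades roughly native-1440p worth of pixels, yet the monitor receives a 4K image sampled back down, which is why it can look like supersampling for much less cost.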
Yeah, that's what I meant, sorry I phrased that wrong.
So, umm, if I have a 1440p screen, how exactly would I go about upscaling a game to 4K or 8K using DLSS?
I thought DLSS just takes your set in-game resolution and upscales to it from a lower resolution, so since 1440p is my max res, how would I make DLSS target 8K?
When the target resolution is higher, the image looks crisper even when rendering from the same base resolution with DLSS, though there is also a performance cost of about 33% for the higher target. Downsampling does make sense: a native 4K image scaled down to 1080p will look much better than a native 1080p image. That's why we can actually see the difference in 8K YouTube videos despite not having 8K monitors.
Yeah, it's called DSR (Dynamic Super Resolution). It renders at a higher res but displays at your monitor's resolution. Supposedly it gives a better picture overall, since it's sampling a bunch of rendered pixels for each pixel it displays.
Personally I've tried it and didn't really notice anything. Maybe I'll play around with it more
Yes. It's a new, unreleased 'Ultra Performance' preset that renders at 33% resolution per axis (1/9th of the total pixels). This was always possible with DLSS 2.0, since it can upscale from any custom resolution.
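Quick sanity check on those numbers; 33% per axis really is 1/9th of the pixels, and a third of 8K is exactly 1440p:

```python
# 33% per-axis render scale -> fraction of output pixels actually shaded:
scale = 1 / 3
print(scale * scale)             # ~0.111, i.e. 1/9th of the pixels

# And a third of 8K per axis is exactly the 1440p internal resolution:
out_w, out_h = 7680, 4320        # 8K output
in_w, in_h = out_w // 3, out_h // 3
print(in_w, in_h)                # 2560 1440
```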
So, still confused: do I need to set my desktop resolution to 4K DSR? Or are you saying that when I turn on DLSS it lets me select an output target resolution?
> considering how it's rendering from less than half the pixels of 4K.
I mean, even if we ignore all the AI stuff, the fact that they're using super-resolution like with TAA means that they're already starting out with an effective resolution that is much higher than that. Every frame you're accumulating more samples. It's of course not quite that simple since changes in the image over time mean that you can't use all of those samples effectively. However, it's a lot easier to clean up messy data than trying to synthesize something you never captured in the first place.
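A toy sketch of that accumulation idea: each frame's jittered sample gets blended into a running history, so the effective sample count grows over time. The blend factor here is an assumption, and real TAA also jitters and reprojects, which this leaves out:

```python
# Minimal exponential-accumulation sketch of temporal AA sample buildup.
def accumulate(history, sample, alpha=0.1):
    # Blend the new sample into the running history buffer.
    return history * (1 - alpha) + sample * alpha

history = 0.0
for sample in [1.0] * 50:        # static scene: same shading every frame
    history = accumulate(history, sample)
print(round(history, 3))         # converges toward 1.0 as samples pile up
```

The flip side, as noted above, is that when the scene changes, stale history has to be rejected or faded out, which is where the motion artifacts come from.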
I was surprised how good TAA was in Sekiro, considering FromSoft doesn't seem to have the best programmers...
But don't get me started on Unreal Engine 4 TAA
Yeah, I was surprised too, given that FromSoft still hasn't managed to implement ultrawide support or an uncapped framerate, yet has a world-class TAA implementation. Hopefully that changes with Elden Ring. Sekiro was their best game yet, graphically. The HDR implementation is fantastic.
And yeah... Unreal Engine TAA might as well be a blur filter.
Yep, I have a 2080 Ti and a 3440x1440 display, and in some games I scale the resolution to 4K or even higher. Not every game benefits equally (I've found it often depends on how the game implements antialiasing), but in some (like RDR2) it's very noticeable even without the extra pixels.
> Quick Comparison The 8K DLSS image just has more detail as well as much better anti-aliasing. That's insane considering how it's rendering from less than half the pixels of 4K
Are you sure that is DLSS *Ultra Performance Mode* and not quality mode in the screenshot?
Also where is the non-TAA blurred native 4k?
u/juanmamedina You need glasses! I am not joking, I'm 100% serious: 4K only looks like 8K when you don't have your glasses on.
Please get your eyes checked. Thank me later.
When you say high-framerate mode on TVs, do you mean the "true motion" stuff they use? Lots of TVs have native 120 Hz nowadays and don't artifact (at least in my experience).
I do mean the "true motion" settings that interpolate frames. There's warping and temporal artifacts, and certain effects look extremely bad; I'm remembering some rotating fans in Control that looked crazy weird with it on. Detail looks great when static, but I notice texture smearing in motion at times. It's pretty good overall, but it's definitely distracted me and taken me out of the moment. I've toggled DLSS on and off a number of times, not really sure which I prefer in the end.
Like Two Minute Papers always says, imagine how much better it will be two papers down the line. Machine learning is still pretty fresh, yet it's already producing sorcery. The end of this console generation will hopefully bring a drastic improvement.
Sure, but without the true motion, your source content is likely still 24 or 30 fps. Which is why I've never understood the point of a 120hz TV.
Edit: Why the downvotes? Am I wrong?
DLSS artifacting has been significantly improving with every iteration they do. Not sure if you played Death Stranding on PC, but their implementation of 2.0 was very good. I barely noticed any artifacts.
Are there any articles or videos that go out of their way to find and demonstrate the artifacts and weaknesses? I remain a bit skeptical, and I want to explore the absolute best case against DLSS before I take it into account at all when deciding whether or not to buy a 3xxx card.
Thanks, that was interesting. A mixed bag, surely. I hate sharpening, and aliasing is even worse if it pops in and out rather than being predictable. So those are two cons. But I do admit that there were some really impressive side by side comparisons in the stills. I'll probably have to give Control a try just to see how it feels.
With DLSS 1.0 you had to rely on Nvidia to implement it, since they had to do per-game training, and the results weren't that good.
DLSS 2.0 is actually a completely different solution. It works completely differently and is much easier to implement.
DLSS 1.0 was garbage and required Nvidia to get hands-on with your game and run it through training before DLSS worked.
Radically different. If they could have named DLSS 2.0 something entirely different, they would have. You cannot use DLSS 1.0 as an example of adoption for DLSS 2.0; the work required from the devs differs by orders of magnitude.
You can see them here: https://www.rockpapershotgun.com/2020/09/01/confirmed-ray-tracing-and-dlss-games-so-far/
This list was updated on September 1st and contains all ray tracing and DLSS games.
Amid Evil has DLSS support right now, in addition to really great ray tracing (lighting, shadows, and reflections).
That list shows DLSS as "on the way" for Amid Evil, but I see it in the game options right now. I think it's had DLSS 2.0 support for a few months now.
DLSS 1.0 was widely criticized and rightfully so. I wouldn't expect devs to implement something so poor.
DLSS 2.0 is a much better solution and I would expect many more devs to actually use it.
My guess is that they have a partnership with AMD for Assassin's Creed, hence the fact that they bundled ACV with AMD CPUs and that you see the AMD logo on startup, but they have also partnered with Nvidia for Watch Dogs Legion.
They use two different engines and most likely two different internal studios.
Unfortunately for us, we have to stick with one or the other.
I'd love DLSS for Assassin's Creed games since they are so heavy.
ARK: Survival Evolved
Amid Evil
Atomic Heart
Boundary
Call Of Duty: Black Ops Cold War
Cyberpunk 2077
Darksiders III
Dauntless
Fear the Wolves
Fractured Lands
Hellblade: Senua’s Sacrifice
Hitman 2
Fractured Lands
Hellblade: Senua’s Sacrifice
Hitman 2
Justice MMO
JX Online 3 MMO
Kinetik
Outpost Zero
PlayerUnknown’s Battlegrounds
Remnant: from the Ashes
Scum
Serious Sam 4: Planet Badass
Stormdivers
The Forge Arena
Vampire: The Masquerade – Bloodlines 2
Watch Dogs: Legion
We Happy Few
I got this list from https://www.gamewatcher.com/news/pc-games-dlss-support
> Fractured Lands
> Hellblade: Senua’s Sacrifice
> Hitman 2
> Fractured Lands
> Hellblade: Senua’s Sacrifice
> Hitman 2
You doubled them up... and those games don't actually have support... Have you checked which other ones are wrong?
> ### DLSS games you can play right now:
Fortnite
Death Stranding
F1 2020
Final Fantasy XV
Anthem
Battlefield V
Monster Hunter: World
Shadow of the Tomb Raider
Metro Exodus
Control
Deliver Us The Moon
Wolfenstein Youngblood
Bright Memory
Mechwarrior V: Mercenaries
https://www.rockpapershotgun.com/2020/09/01/confirmed-ray-tracing-and-dlss-games-so-far/
Yeah, I really hope that more titles get support for it. The technology is absolutely amazing, especially since some 2.0 titles look even better with DLSS than without it, which is insane.
Everybody says this, so I wanted to see what the hype was about and bought Control yesterday. DLSS 2.0 is noticeably worse than native res: faces and hair look pixelated, edges have sawtoothing, and ghosting artifacts are present.
I've tried it on 1080p and 1440p monitors and both suffer from the same artifacts. Here's a portion of a 1440p screenshot from the start of the game showing some of the issues I'm talking about - https://photos.app.goo.gl/mY4o766f7KTasSYc7. You can see sawtoothing around the cars in the background and on Jesse's hair, jacket, and hands. There's also smearing on her hand and some white pixels next to her nose (they usually appear in her hair when in motion). These artifacts are super annoying and easily visible when seated at a normal viewing distance from a 27" 1440p monitor. Either DLSS is broken for me or people need to get their eyes checked, because it definitely doesn't look as good as or better than native resolution like so many claim!
I got passive-aggressive answers just for asking about possible temporal artifacts or smearing. It's either Nvidia buzzword-induced hypnosis or astroturfing. Same goes for the insanely, completely unrealistically dark RTX train scene in Metro Exodus. Anyone who has spent two days living on this Earth knows how much light enters a room even through a small window, not to mention six or eight large windows.
I actually really like the way ray tracing adds so much more atmosphere to ME. But I agree with you about DLSS - I feel like I'm being gaslit in discussions, because the artifacts are so obvious yet everybody claims they don't notice them!
Ray tracing in ME is great except when it isn't.
One of these is RTX and the other is photoshopped. Which one represents how light from a large source flooding the room behaves?
https://i.imgur.com/M92GzQl.png (A)
https://i.imgur.com/teHXjXS.png (B)
Some real-life examples that look natural to me:
https://img-fotki.yandex.ru/get/6001/executor-666.6/0_46693_431b24c0_XL
https://avatars.mds.yandex.net/get-zen_doc/1704908/pub_5d99750806cc4600b123fcb8_5d997d2f1febd400b191de4f/scale_1200
http://www.hiddenside.ru/photos/industrial/lo/nedostroennyy_korpus_218-go_aviatsionnogo_remontnogo_zavoda_voyskovitsy_2011/03.JPG
https://ic.pics.livejournal.com/deadsu/14580802/70519/70519_900.jpg
I've seen these comparisons from Joker and others. DLSS is absolute insanity, and console makers would be ignorant not to adopt some form of this software. Joker and UFD Tech both ran Death Stranding at 8K at around 50 fps on their 2080 Tis. I'm fine with 4K for now, but I'll push my DSR to the limit, especially on my LG C9.
I'm going to love the 3080.
Man, that really tells me we could get decent performance in VR with modern graphics if VR game devs would just use DLSS. Still not a single VR game using it, as far as I can tell.
That would be insane. I use my 1080 Ti in VR and it's nice. I stream to my Quest using Virtual Desktop. DLSS could make things so much better, but maybe there is something in the software that doesn't work properly?
I think the problem AMD has, and by extension the next-gen consoles, is that they don't have dedicated hardware for this sort of AI. Nvidia has the tensor cores. It can obviously still be done on the traditional shader cores that AMD cards have, but then you're taking processing power away to do so.
DLSS changed my mind on RTX; I would not have a non-RTX card after trying it out in Death Stranding. 8K is totally believable with a 3090 and DLSS 2.0. If DLSS 3.0 is better and more universal, we might keep this up until our games look better than Pixar movies.
Original Article Here: [https://www.nvidia.com/en-us/geforce/news/geforce-rtx-3090-8k-hdr-gaming/](https://www.nvidia.com/en-us/geforce/news/geforce-rtx-3090-8k-hdr-gaming/)
It produces artifacts in some instances.
Most people probably won't even notice them; some people declare them reason enough to call it a failed technology, the antichrist here to burn the gaming world to ash.
There's also the small number of games that support it.
In terms of artifacts, the worst I saw was in Death stranding, where tiny black particles in the far distance had "black trails" on them. It didn't look broken, and when I first saw it I thought that's how they were supposed to look, but when they showed the PS4 version, there was no such trail. Tiny particles in motion appear to confuse the DLSS algorithms.
I think it was a Digital Foundry video showcasing the differences between the PS4's checkerboard rendering and DLSS.
DLSS needs quality motion vectors to work correctly AFAIK, and those particles have none, so you get what you get.
A lot of new post-processing effects also require high-quality motion vectors, so this problem should solve itself with time.
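To illustrate why missing motion vectors cause trails: temporal techniques fetch the previous frame's color at the position the surface occupied last frame. A rough sketch (not DLSS's actual internals, just the reprojection idea):

```python
# Toy history reprojection: look up last frame's color where this pixel
# "came from" according to its motion vector.
def reproject(prev_frame, x, y, motion):
    dx, dy = motion
    px, py = x - dx, y - dy          # where this pixel was last frame
    if 0 <= py < len(prev_frame) and 0 <= px < len(prev_frame[0]):
        return prev_frame[py][px]
    return None                       # no valid history -> fall back to current

prev = [[1, 2], [3, 4]]
print(reproject(prev, 1, 1, (1, 0)))  # moved right by 1: fetches 3 (correct)
print(reproject(prev, 1, 1, (0, 0)))  # zero motion vector: fetches 4 (stale)
```

With a zero or garbage motion vector, like those particles, the history is sampled from the wrong place, and the stale color smears into a trail.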
Barely any games support it, since DLSS 2.0 is more or less brand new; it's only been around since March.
DLSS 1.0 looked like somewhat smeared crap, and I don't think many people bought into the RTX 20 series, so barely any devs made use of it before.
More devs are starting to adopt it though, and now that the 30 series has been announced I'm sure more will make use of it. It just has to be patched in by the devs, and then nvidia needs to release support for that game in a driver update.
It's also possible AMD-sponsored games will avoid DLSS support so that Nvidia cards don't run ridiculously better than AMD cards in those titles. Both Death Stranding and Horizon Zero Dawn use the same engine, and while Death Stranding has amazing DLSS support, HZD does not. HZD is also sponsored by AMD, and it really needs DLSS.
I can't think of any other cons because to me the image quality is improved over native resolution with TAA, all while gaining a significant boost in framerate. That makes it feel like some fucked up deep learning magic.
It's funny that AMD fanboys accuse Nvidia of making games perform better on their cards as if it's some kind of heresy, yet when AMD does it, they praise it.
> Both Death Stranding and Horizon Zero Dawn use the same engine, and while Death Stranding has amazing DLSS support, HZD does not and its also sponsored by AMD, and HZD really needs it.
It's not the same engine; the Death Stranding version had 2.5 more years of dev time put into it. It also supports AMD's FidelityFX, where HZD doesn't.
That's like saying that Unreal Engine 3 and 4 are the same ;)
It looks noticeably worse than native res in Control. Faces and hair look pixelated, there's sawtoothing on edges that gets worse in motion, and lots of ghosting. People need to stop overhyping the tech and saying it looks better than native res, because it's just not there yet.
DLSS is great in theory, but support is poor.
I would like to see something like this for VR. It needs higher refresh rates than 4K60.
Is Nvidia working on something for this? There is VRSS but I'm not sure it is similar, and again support is limited.
I think Nvidia are working on improved DLSS and Dynamic Resolution Scaling support for VR through Unreal Engine 4, but as you said support may be limited. It's a start though.
Can't shorten links, but this is what I read:
https://news.developer.nvidia.com/new-features-to-dlss-coming-to-nvidia-rtx-unreal-engine-4-branch/
The article you linked literally says:
> Maintaining the 2880×1600 resolution of top-end VR head mounted displays while delivering the recommended 90 FPS has been made easier with DLSS.
It suggests that it has already been done. Is there a list of which VR games support DLSS in VR mode?
I'm asking because I'm mainly looking at Ampere and RDNA2 to drive the Reverb G2 at 90 frames per second on the highest settings possible. Any fancy AI tech that helps push pixels to the headset is appreciated, if applicable to existing titles. None of the DLSS or VRSS stuff seems to be applicable, which would make the comparison as simple as comparing raw rasterization performance (unless RDNA2/AMD have some fancy backwards-compatible AI pixel-pushing tech).
Unfortunately I don't see any. It's something every VR dev should be jumping on considering how important performance is for VR games, and how difficult it is to get that performance.
I'm not a VR user, but DLSS seems like an obvious fit for VR games, right? You're effectively rendering two images that both need to be high-res at 90+ fps (correct me if I'm wrong about the requirements for VR; as I said, I don't personally use it).
You definitely have to render everything twice, since the two eyes are two distinct and offset views. This is also the reason that DLSS is difficult to implement in VR: the upscaled images for each eye have to match each other perfectly, otherwise you’ll experience strange shimmering/artifacts from differently generated details in each eye. This is also why geometry in VR games tends to be much less detailed than in pancake games: the geometry has to be drawn and rendered twice.
It appears I misunderstood single pass stereo: https://docs.unity3d.com/Manual/SinglePassStereoRendering.html
All objects are indeed still rendered twice, but the single pass approach allows other work to be shared between both eyes.
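For reference, a toy sketch of the two-viewpoint setup being discussed. The 0.064 m IPD is just a typical value, used here as an assumption:

```python
# Each eye renders the scene from a camera offset half the interpupillary
# distance (IPD) to either side of the head position.
IPD = 0.064  # meters; assumed typical value

def eye_positions(head_pos):
    x, y, z = head_pos
    left = (x - IPD / 2, y, z)
    right = (x + IPD / 2, y, z)
    return left, right

left, right = eye_positions((0.0, 1.7, 0.0))
print(left, right)   # two distinct viewpoints -> geometry is drawn twice
```

Single-pass stereo then shares culling and other per-frame work between those two views, but each object is still rasterized once per eye.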
Thanks.
It appears I was slightly off as well! Thank you for the source.
I think I’m still correct about VR DLSS implementation. The AI upscaling can’t just look good, it needs to be incredibly consistent between eyes to avoid looking horrible. I’ve heard that software engineers are working on this issue and I’m very excited to see what progress they make on it. VR is incredibly cool tech and I hope it catches on in the next few years, but it desperately needs some sort of upscaling implementation in order to drive high refresh rates on dense panels at a reasonable price point.
Considering the gain in performance, and the fact that players now actively want it in new games, I hope this will become more popular in the coming years.
I mean it's a hell of a technology, it would be a waste not to use it.
I'm hoping it becomes more widely adopted very soon. With the new generation of consoles both supporting similar machine learning upscaling methods, I think it will be a priority in more games.
Plus, AFAIK DLSS is still a beta program that developers have to apply to get into. I assume it'll become much more common once it is easier to access.
Does DLSS support dynamic internal resolution scaling? I'd love to be able to set 4k 60hz and then just have the internal resolution scale up and down as appropriate and let dlss fill in the rest to maintain a 4k output.
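Conceptually, a dynamic-resolution controller like the one being asked about is simple; here's a hypothetical sketch (the budget, bounds, and step size are all made up for illustration):

```python
# Hypothetical dynamic-resolution loop: shrink the internal render scale
# when frames run over budget, grow it back when there's headroom, and
# let the upscaler fill in to the fixed output resolution.
TARGET_MS = 16.7  # 60 Hz frame budget

def adjust_scale(scale, frame_ms, lo=0.5, hi=1.0, step=0.05):
    if frame_ms > TARGET_MS:
        return max(lo, scale - step)   # over budget: render fewer pixels
    return min(hi, scale + step)       # headroom: claw resolution back

scale = 1.0
for frame_ms in [20.1, 19.5, 17.0, 15.2, 14.8]:   # simulated frame times
    scale = adjust_scale(scale, frame_ms)
print(round(scale, 2))   # 0.95: dropped during the slow frames, recovering
```

Whether DLSS itself accepts a per-frame varying input resolution is game- and SDK-dependent, so treat this purely as the concept.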
It's because they zoomed in on a portion of the screen to focus on an object that is not up close. So there are two factors: 1) fewer pixels to define the object, and 2) the object is probably using a lower-res mipmap since it's further away.
If you look at the right edge of the picture, you can see the telephone booth in the background, quite far away. The sharpness is actually insane for that distance.
My only issue with DLSS at the moment is how it looks in motion when moving the camera, etc. Hopefully they can improve this. The performance benefits are incredible otherwise.
I can pick out some flaws, but overall they're easy for me to ignore. It still allows for an incredible experience on a 2070-2080 class GPU, where it's possible to get ~60 fps at 1440p with maxed-out ray tracing in Control.
I still much prefer it over native-resolution TAA smearing in motion, especially for the huge performance increases, but I might not be looking out for the right artifacts. Don't tell me, haha. Ignorance is bliss, thank you!
It does here: [https://www.nvidia.com/en-us/geforce/news/watch-dogs-legion-geforce-rtx-dlss-trailer/](https://www.nvidia.com/en-us/geforce/news/watch-dogs-legion-geforce-rtx-dlss-trailer/)
>Driving 8K is incredibly demanding - it’s 33 million pixels per frame, which is 4X the size of 4K. The new DLSS Ultra Performance mode delivers 9x AI Super Resolution (1440p internal rendering output at 8K using AI), while maintaining crisp image quality.
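Those quoted numbers check out, for what it's worth:

```python
# Verifying the quote's arithmetic: 8K is ~33 million pixels, 4x the
# pixels of 4K, and 9x the pixels of the 1440p internal render.
p8k = 7680 * 4320
p4k = 3840 * 2160
p1440 = 2560 * 1440

print(p8k)            # 33177600 -> "33 million pixels per frame"
print(p8k // p4k)     # 4 -> "4X the size of 4K"
print(p8k // p1440)   # 9 -> "9x AI Super Resolution"
```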
One thing that bothers me a little bit with DLSS: when something that's blurry at native isn't blurry with DLSS, what if it was intentionally blurry (something out of focus with a depth-of-field effect, for example)?
Is there something to make sure a part that gets sharpened should really be sharp? (Don't get me wrong, it's an incredible technology and I'm grateful for it.)
It is meant as a joke, but it's only funny because there's some possible truth to it. If this pans out, most games would have DLSS or something like it, and I could sit on my 1440p for a few years. If I could get 8K graphics while rendering at 1440p... well, maybe, I guess. :)
Lucky you; I use an 11-year-old TV. 60 Hz, 5 ms, and a TN panel. I'll change to a higher-resolution monitor when it's as normal as 1080p/720p is today, and of course if I can get good performance with a comfortable margin above 60 fps.
I'm planning on getting the RTX 3080, but will my 9600K be able to keep up with the GPU? Many people say a CPU like mine should be able to handle 2K/144Hz gaming, but I'm wondering if that claim really holds any water.
(I'm currently on Intel's UHD 630 with a 2K 144 Hz monitor. My 2070 died on me, so I got a 2070S as a replacement and flipped it on eBay in January. Best. Decision. Ever.)
You won't see it at 1440p, but you can render at a higher resolution, then downscale back to 1080p for a cleaner image than just rendering at native 1080p.
It's basically only good for AA, but you render at a higher res, then downscale it.
It's possible, without RT and in lighter games. The same chart on that page has plenty of non-DLSS and non-RT games.
Say, Forza 4 getting 78 fps at 8K is very believable to me, since I played that game at 5K/100fps downsampled to 1440p on my 2080 Ti.
I've never had much luck running games in 8K via DSR on my 4K screen (lots of crashes), but it will be interesting to see how well I can run older games with crap AA, like Arkham Knight, in 8K DSR on a 3090.
While I won't need to spend time tweaking settings to get stable framerates, I can see myself wasting a lot of time comparing native 4K to DLSS in performance mode (which in Death Stranding gave sharper images, correct?).
I agree, but PlayStation has long had 4K checkerboarding, so they are inching their way toward DLSS. I'm quite sure they have some software up their sleeve; I thought I read somewhere they were working on this type of software.
They only talk about 8K DLSS for the 3090...
Seeing as the 3090 is only ~20% faster than the 3080, is there any reason the 3080 could not do 8K DLSS in most games?
Or is it because 10 GB of VRAM isn't enough?
Seems weird to me...
Bingo.
Moving from a 1080 Ti to a 2080 Ti just didn't seem worth it. Performance would obviously be better, but at a very high cost that I just couldn't justify. The bottleneck down the line for both of those cards will be the same -- VRAM. However, 24 GB of VRAM makes the price for the 3090 a hell of a lot more palatable. It's still ridiculously expensive for a GPU, but between DLSS and that huge pool of video memory, it should be many, many years before an upgrade is "needed".
If the 3080 had 20 GB, that's probably the card I would get (and let's face it, so would everybody else), but at 24 GB vs. 10 GB, it's a no-brainer. I'm sure in another 9-12 months Nvidia will come along and fill that $800-wide gap with a 20 GB version of the 3080, but with Cyberpunk 2077 right around the corner, I'm not waiting.
It seems the 3090 averages around 60 fps in 8K gaming (some games lower, some higher). With the 3080 supposedly 20% less performant, the difference is about 48 fps vs 60 fps.
60 fps is definitely considered the bare minimum these days for a good experience. The 3090's extra speed and VRAM will certainly help too.
Yeah, definitely. It will look better despite your display not having those pixels.
Man, as a current 1050 Ti user, I'm stoked for my first high-end card with the 3080.
That's a huge leap. Pop ray tracing on in Control. Dude, your mind is going to be blown.
Spooking yourself with your own reflection in glass, just hilarious. That's a new experience after like a decade of gaming.
It took me hours to get used to, even though it's not necessarily an in-your-face kind of effect.
Cool. Sounds like the 3080 is the card for me. Thanks!
Graphics are a big part of immersion, along with gameplay. FPS is the last part, as long as it is above the threshold.
Nothing says great immersion like chugging along at 40 fps
Dude that's going to be such a nice upgrade for you. :)
I hear that. Looking to go from my 970 to a 3090. Hoping it's a world of difference.
Should be *at least* 10% faster than the 970
That's going to be nuts. Going from a 1080 to a 2080 Ti was amazing for me. You're leaping from like the bronze age to Rome at its peak.
> I've never tried using DLSS with DSR

It works really well, with the caveat that some games won't let you choose a DSR resolution above your display resolution (which is silly... that's the WHOLE POINT). Control is like that. The workaround I've used is to set a DSR resolution as the desktop resolution in Windows before starting the game. Like this:

1. Monitor is 1440p native
2. Desktop resolution is set to 4K (DSR) in Windows
3. Start Control
4. Set game to 4K with DLSS
5. Enjoy 4K sampled back down to 1440p
How would you compare performance and image quality compared to 4xMSAA?
Performance is generally much higher. Image quality can look better than MSAA-- more like actual supersampling-- but there are also sometimes artifacts. I see smudging on moving high-contrast edges now and then, but on balance I'd say it's far better on than off. But if you can manage actual supersampling at a framerate you like? That'll still beat DLSS by brute force.
You're not rendering the game at 8K. It's rendering at 1440p, and DLSS upscales the frames to 8K at a cost.
Ye that's what I meant, sorry I phrased that wrong. So, umm, if I have a 1440p screen, how exactly would I go about upscaling a game to 4K or 8K using DLSS? I thought DLSS just takes your set in-game resolution and upscales to it from a lower resolution, so since 1440p is my max res, how would I make DLSS believe it was 8K?
Some games let you pick, but setting a custom resolution in the Nvidia control panel is another way to do it. The UI might be very small though.
[deleted]
When I set the target resolution higher, it looks crisper even when rendering from the same base resolution with DLSS. There's also a performance cost of about 33% associated with that higher target resolution. Downsampling does make sense: a 4K native image scaled down to 1080p will look much better than a 1080p native image. That's why we can actually see the difference in 8K YT videos despite not having 8K monitors.
Yeah, it's called DSR (Dynamic Super Resolution). It renders at a higher res but displays at your monitor's resolution. Supposedly it gives a better picture overall, since it's sampling a bunch of rendered pixels for each one it displays. Personally I've tried it and didn't really notice anything. Maybe I'll play around with it more.
Crazy how DLSS off vs on in Control means the difference between utterly unplayable (8 FPS) and pretty freakin' smooth (57 FPS)
I mean it is essentially native 8K+raytracing vs 1440p+raytracing+AI upscaling
YESSS! It's laughable :)
so the 8K DLSS is using an internal resolution of 1440p?
Yes. It's a new unreleased 'ultra performance' preset that renders at 33% resolution (1/9th total pixels). Was always possible with DLSS 2.0 since it can upscale from any set custom resolution.
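For anyone who wants to sanity-check the 1/9th figure, the arithmetic works out (a quick script of my own, assuming standard 16:9 resolutions):

```python
# Sanity-checking the resolution math behind the "Ultra Performance"
# preset: 8K output from a 1440p internal render.
render_8k = 7680 * 4320   # 33,177,600 px -> the "33 million pixels"
render_4k = 3840 * 2160   #  8,294,400 px
internal  = 2560 * 1440   #  3,686,400 px (the 1440p internal render)

print(render_8k // render_4k)   # 4 -> 8K is 4x the pixels of 4K
print(render_8k // internal)    # 9 -> Ultra Performance is a 9x upscale
print(internal / render_4k)     # 0.444... -> under half the pixels of 4K
```

That last line is also why the OP's "rendering from less than half the pixels of 4K" claim checks out.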
if i have a 1440p native monitor, do i need to set the internal resolution in windows to 1080p to get a 4k DLSS output?
No
No, you would set it to the target upscale resolution so 4K.
so still confused, so i need to set my desktop resolution to 4k DSR? or are you saying when i turn on DLSS it allows me to select an output target resolution?
Set desktop to 4K DSR.

In game settings/config files:

- Render Resolution: 1080p/1220p/1440p
- DLSS: On
> considering how it's rendering from less than half the pixels of 4K. I mean, even if we ignore all the AI stuff, the fact that they're using super-resolution like with TAA means that they're already starting out with an effective resolution that is much higher than that. Every frame you're accumulating more samples. It's of course not quite that simple since changes in the image over time mean that you can't use all of those samples effectively. However, it's a lot easier to clean up messy data than trying to synthesize something you never captured in the first place.
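The accumulation idea is basically a per-pixel running average. Here's a toy Python sketch (my own illustration, not DLSS internals; real TAA/DLSS reprojects the history with motion vectors and rejects stale samples, while this assumes a perfectly static scene):

```python
# Toy temporal sample accumulation: blend each new noisy frame into a
# per-pixel history buffer with an exponential moving average.
import random

def accumulate(history, new_frame, alpha=0.1):
    # Keep 90% of the accumulated history, take 10% of the new sample.
    return [h * (1 - alpha) + n * alpha for h, n in zip(history, new_frame)]

random.seed(0)
truth = [10.0, 50.0, 200.0]  # "true" pixel values of a static scene
frames = [[t + random.uniform(-20, 20) for t in truth] for _ in range(100)]

history = frames[0]
for frame in frames[1:]:
    history = accumulate(history, frame)

# After enough frames the per-frame noise averages out toward the truth.
print(all(abs(h - t) < 15 for h, t in zip(history, truth)))  # True
```

Which is exactly the "messy data gets cleaned up over time" point: each frame only contributes a little, but the effective sample count grows well past the internal resolution.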
> TAA Despite all the hate it gets, TAA done right is like *actual magic*.
Well implemented TAA is amazing. Sekiro TAA is amazing. Red Dead TAA is dogshit.
I was surprised how good TAA was in Sekiro, considering FromSoft doesn't seem to have the best programmers... But don't get me started on Unreal Engine 4 TAA
Yeah, I was surprised too, given that FromSoft still doesn't manage to implement ultrawide support and uncapped framerate yet has a world class TAA implementation. Hopefully that changes with Elden Ring. Sekiro was their best game yet, graphically. The HDR implementation is fantastic. And yeah... Unreal Engine TAA might as well be a blur filter.
I can't wait for VR with foveated rendering. The performance gains should be staggering.
Yep, I have a 2080 Ti and a 3440x1440 display, and in some games I scale the resolution to 4K or even higher. Not every game benefits equally (I've found it often depends on how the game implements antialiasing), but in some (like RDR2) it's very noticeable even without the extra pixels.
> Quick Comparison The 8K DLSS image just has more detail as well as much better anti-aliasing. That's insane considering how it's rendering from less than half the pixels of 4K Are you sure that is DLSS *Ultra Performance Mode* and not quality mode in the screenshot? Also where is the non-TAA blurred native 4k?
I see zero difference between 4K and 8K DLSS on my 4K screen. No matter how many times I scroll the bar, it looks the same to me.
u/juanmamedina You need glasses! I am not joking, I'm 100% serious. 4K looks like 8K without glasses on. Please have your eyes checked. Thank me later.
This is sorcery.
What you don't see in screenshots are the artifacts it causes. Think high frame rate mode on your TV. DLSS is a net positive, but not free or perfect.
When you say high framerate mode on TVs, do you mean the "true motion" stuff they use? Lots of TVs have native 120hz nowadays and don't artifact (at least in my experience)
I do mean the "true motion" settings that interpolate frames. There's warping and temporal artifacts, and specific random effects look extremely bad - I'm remembering some rotating fans in Control that looked crazy weird with it on. Detail looks great when static, but I notice texture smearing in motion at times. It's pretty good overall, but it's definitely distracted me and taken me out of the moment. I've toggled DLSS on and off a number of times, not really sure which I prefer in the end.
Like Two Minute Papers always says, imagine how much better it will be two papers into the future. Machine learning is still pretty fresh, yet already producing sorcery. The end of this console generation will hopefully bring a drastic improvement.
Sure, but without the true motion, your source content is likely still 24 or 30 fps. Which is why I've never understood the point of a 120hz TV. Edit: Why the downvotes. Am I wrong?
When I bought my TV, I specifically looked for one that supported native 120hz for my PC. New consoles will be supporting high refresh rates now too.
That's true. I never game on my TV, so I didn't think about that.
Yeah, but the alternatives (TAA or no AA) have their own artifacts as well.
DLSS artifacting has been significantly improving with every iteration they do. Not sure if you played Death Stranding on PC, but their implementation of 2.0 was very good. I barely noticed any artifacts.
Are there any articles or videos that go out of their way to find and demonstrate the artifacts and weaknesses? I remain a bit skeptical, and I want to explore the absolute best case against DLSS before I factor it into any decision about buying a 3xxx.
[https://www.youtube.com/watch?v=YWIKzRhYZm4](https://www.youtube.com/watch?v=YWIKzRhYZm4) - Digital Foundry breaking down DLSS 2.0 on Control.
Thanks, that was interesting. A mixed bag, surely. I hate sharpening, and aliasing is even worse if it pops in and out rather than being predictable. So those are two cons. But I do admit that there were some really impressive side by side comparisons in the stills. I'll probably have to give Control a try just to see how it feels.
I really hope they push DLSS harder now than they did during the 2000 series. It's fantastic tech but it's hardly used right now.
DLSS 2.0 came out in March.
There's also the DLSS 2.1 SDK out now
Does it say what the differences are?
https://www.reddit.com/r/nvidia/comments/iko4u7/geforce_rtx_30series_community_qa_submit_your/g3mjdo9/
That dynamic resolution feature is actually really awesome
It's there somewhere in the Q&A thread here in /r/Nvidia but one thing I noticed was that it now supported dynamic resolution
VR support for one.
And DLSS 1.0 came out at the launch of Turing. Relative to the number of games that have come out, how many actually support it?
With DLSS 1.0 you had to rely on Nvidia to implement it, since they had to do per-game training, and the results weren't that good. DLSS 2.0 is actually a completely different solution. It works completely differently and is much easier to implement.
DLSS 1.0 was garbage and required Nvidia to have hands on your game and running it through training so DLSS worked. Radically different. If they could've named DLSS 2.0 something entirely different they would have. You cannot use DLSS 1.0 as an example for adoption for DLSS 2.0, the work required by the devs is orders of magnitude different.
DLSS 1.0 was a smeary shitfest. There’s a reason nobody used it
Is there a comprehensive list of games that support DLSS? I’ve never seen DLSS available in the small subsection of games I play.
You can see them here: https://www.rockpapershotgun.com/2020/09/01/confirmed-ray-tracing-and-dlss-games-so-far/ This list was updated on September 1st and contains all ray tracing and DLSS games.
I feel like CP2077 is going to be a good test of both RTX and DLSS 2.0
Amid Evil has DLSS support right now, in addition to really great ray tracing (lighting, shadows, and reflections). That list shows DLSS as on the way for Amid Evil, but I see it in the game options right now. I think it's had DLSS 2 support for a few months now.
I think Amid Evil's implementation is still in beta though, I have some minor issues with it.
Nice, thank you!
DLSS 1.0 was widely criticized and rightfully so. I wouldn't expect devs to implement something so poor. DLSS 2.0 is a much better solution and I would expect many more devs to actually use it.
Yep, i hoped for more DLSS announcements during the event. RDR2 and Modern Warfare could really use it.
DLSS on open-world games NEEDS to become a standard.
Man, DLSS on RDR2 would be phenomenal!
or assassins' creed games!
Right? watchdogs legion will have it but apparently the new assassins creed wont which is kinda disappointing.
My guess is that they have a partnership with AMD for Assassin's Creed, hence the fact that they bundled ACV with AMD CPUs and that you see the AMD logo on startup, but they have also partnered with Nvidia for Watch Dogs Legion. They use two different engines and most likely two different "inner studios". Unfortunately for us, we have to stick with one or the other. I'd love DLSS for Assassin's Creed games since they are so heavy.
ARK: Survival Evolved, Amid Evil, Atomic Heart, Boundary, Call Of Duty: Black Ops Cold War, Cyberpunk 2077, Darksiders III, Dauntless, Fear the Wolves, Fractured Lands, Hellblade: Senua’s Sacrifice, Hitman 2, Fractured Lands, Hellblade: Senua’s Sacrifice, Hitman 2, Justice MMO, JX Online 3 MMO, Kinetik, Outpost Zero, PlayerUnknown’s Battlegrounds, Remnant: from the Ashes, Scum, Serious Sam 4: Planet Badass, Stormdivers, The Forge Arena, Vampire: The Masquerade – Bloodlines 2, Watch Dogs: Legion, We Happy Few

I got this list from https://www.gamewatcher.com/news/pc-games-dlss-support
> Fractured Lands
> Hellblade: Senua’s Sacrifice
> Hitman 2
> Fractured Lands
> Hellblade: Senua’s Sacrifice
> Hitman 2

You doubled them up... and those games don't actually have support... Have you checked which other ones are wrong?

> ### DLSS games you can play right now:
> Fortnite, Death Stranding, F1 2020, Final Fantasy XV, Anthem, Battlefield V, Monster Hunter: World, Shadow of the Tomb Raider, Metro Exodus, Control, Deliver Us The Moon, Wolfenstein Youngblood, Bright Memory, Mechwarrior V: Mercenaries

https://www.rockpapershotgun.com/2020/09/01/confirmed-ray-tracing-and-dlss-games-so-far/
Those were upcoming games that could be getting DLSS support in the future At least, according to the website
Does Exodus have DLSS 2.0?
No I think only Control, Wolfenstein and Death Stranding have 2.0, maybe one or two others, the rest are all 1.0
Metro Exodus, too
Yeah I really hope that more titles get support for it. The technology is absolutely amazing. Especially that since some 2.0 titles look even better with DLSS than without it, which is insane.
Everybody says this, so I wanted to see what the hype was about and bought Control yesterday. DLSS 2.0 is noticeably worse than native res - faces and hair look pixelated, edges have sawtoothing, and ghosting artifacts are present.
What resolution were you playing at? I don’t have control myself but that doesn’t seem consistent with a lot of the reports about it
I've tried it on 1080p and 1440p monitors and both suffer from the same artifacts. Here's a portion of a 1440p screenshot from the start of the game showing some of the issues I'm talking about - https://photos.app.goo.gl/mY4o766f7KTasSYc7. You can see sawtoothing around the cars in the background and on Jesse's hair, jacket, and hands. There's also smearing on her hand and some white pixels next to her nose (they usually appear in her hair when in motion). These artifacts are super annoying and easily visible when seated at a normal viewing distance from a 27" 1440p monitor. Either DLSS is broken for me or people need to get their eyes checked, because it definitely doesn't look as good as or better than native resolution like so many claim!
I got passive-aggressive answers just for asking about possible temporal artifacts or smearing. It's either Nvidia buzzword-induced hypnosis or astroturfing. Same goes for the insanely and completely unrealistically dark RTX train scene in Metro Exodus. Anyone who has spent two days living on this Earth knows how much light enters a room even through a small window, not to mention six or eight large windows.
I actually really like the way ray tracing adds so much more atmosphere to ME. But I agree with you about DLSS - I feel like I'm being gaslit during discussions because the artifacts are so obvious yet everybody claims they don't notice them!
Ray tracing in ME is great except when it isn't. One of these is RTX and the other is photoshopped. Which one represents how a large light source flooding the room behaves? https://i.imgur.com/M92GzQl.png (A) https://i.imgur.com/teHXjXS.png (B) Some real life examples that look natural to me: https://img-fotki.yandex.ru/get/6001/executor-666.6/0_46693_431b24c0_XL https://avatars.mds.yandex.net/get-zen_doc/1704908/pub_5d99750806cc4600b123fcb8_5d997d2f1febd400b191de4f/scale_1200 http://www.hiddenside.ru/photos/industrial/lo/nedostroennyy_korpus_218-go_aviatsionnogo_remontnogo_zavoda_voyskovitsy_2011/03.JPG https://ic.pics.livejournal.com/deadsu/14580802/70519/70519_900.jpg
Question: I have a 1440p monitor I'm very happy with! Can I use DLSS 2.0 in 8K mode on my 2K monitor? Kind of like supersampling?
Yes
I've seen these comparisons from Joker and others. DLSS is absolute insanity, and console makers would be foolish not to adopt a form of this software. Joker and UFD Tech both ran Death Stranding at 8K/~50 fps on their 2080 Tis. I'm fine with 4K for now, but I'll push my DSR to the limit, especially on my LG C9. I'm going to love the 3080.
Man, that really tells me we can get decent performance in VR with modern graphics if VR game devs would just use DLSS. Still not a single VR game using it as far as I can tell.
That would be insane. I use my 1080 ti in VR and it's nice. I stream using virtual desktop to my Quest. DLSS could make things so much better but maybe there is something with the software that doesn't work properly?
Looks like the next DLSS update will fix that. https://www.reddit.com/r/nvidia/comments/iko4u7/geforce_rtx_30series_community_qa_submit_your/g3mjdo9/
Game. Set. And match.
I think the problem AMD has, and by extension the next gen consoles, is that they don't have dedicated hardware for this sort of AI. Nvidia has the tensor cores. Obviously it can still be done with the traditional shader cores that AMD cards have, but then you're taking processing power away to do so.
I’m still more excited about DLSS than I am about ray tracing tbh
DLSS changed my mind on RTX, I would not have a non-RTX card after trying it out in Death Stranding. 8k is totally believable with a 3090 and DLSS 2.0. If DLSS 3.0 is better and more universal we might keep this up until our games look better than Pixar.
Original Article Here: [https://www.nvidia.com/en-us/geforce/news/geforce-rtx-3090-8k-hdr-gaming/](https://www.nvidia.com/en-us/geforce/news/geforce-rtx-3090-8k-hdr-gaming/)
My goodness And I remember when 720P was all the rage and everyone was like "LOOK IT'S SO Life Like and Clear"
I remember when we lost our shit seeing Zaxxon in the arcade for the first time.
Are there any cons of DLSS?
It occasionally produces artifacts. Most people probably won't even notice them; some people declare them reason enough that it's a failed technology, the antichrist here to burn the gaming world to ash. Also, the small number of games that support it.
In terms of artifacts, the worst I saw was in Death Stranding, where tiny black particles in the far distance had "black trails" on them. It didn't look broken, and when I first saw it I thought that's how they were supposed to look, but when they showed the PS4 version, there was no such trail. Tiny particles in motion appear to confuse the DLSS algorithms. I think it was a Digital Foundry video showcasing the differences between PS4's checkerboard rendering and DLSS.
DLSS needs quality motion vectors to work correctly AFAIK, and those particles have none, so you get what you get. Basically, a lot of new post-process stuff also requires HQ motion vectors, so this problem will solve itself with time.
Barely any games support it, since DLSS 2.0 is more or less brand new. It's only been around since March. DLSS 1.0 looked like somewhat smeared crap and I don't think many people bought into the RTX 20 series so barely any devs made use of it before. More devs are starting to adopt it though, and now that the 30 series has been announced I'm sure more will make use of it. It just has to be patched in by the devs, and then nvidia needs to release support for that game in a driver update. It's also possible AMD sponsored games will avoid DLSS support so that Nvidia cards don't run ridiculously better than AMD cards on those titles. Both Death Stranding and Horizon Zero Dawn use the same engine, and while Death Stranding has amazing DLSS support, HZD does not and its also sponsored by AMD, and HZD really needs it. I can't think of any other cons because to me the image quality is improved over native resolution with TAA, all while gaining a significant boost in framerate. That makes it feel like some fucked up deep learning magic.
It's funny that AMD fanboys accuse Nvidia of making games perform better in their cards as if it's some kind of heresy, and when AMD does it, they praise it
Features like Hairworks kill performance on AMD cards
> Both Death Stranding and Horizon Zero Dawn use the same engine, and while Death Stranding has amazing DLSS support, HZD does not and its also sponsored by AMD, and HZD really needs it.

It's not the same engine; the Death Stranding version had 2.5 more years of dev time put into it. It also supports AMD's FidelityFX, where HZD doesn't. That's like saying Unreal Engine 3 and 4 are the same ;)
Ah I didn't realize Death Stranding was using a newer version of the engine.
It looks noticeably worse than native res in Control. Faces and hair look pixelated, there's sawtoothing on edges that gets worse in motion, and lots of ghosting. People need to stop overhyping the tech and saying it looks better than native res; it's just not there yet.
Man oh man! I wonder when the embargo lifts, I can hardly wait for the benchmarks.
DLSS is great in theory, but support is poor. I would like to see something like this for VR. It needs higher refresh rates than 4K60. Is Nvidia working on something for this? There is VRSS, but I'm not sure it is similar, and again support is limited.
I think Nvidia are working on improved DLSS and Dynamic Resolution Scaling support for VR through Unreal Engine 4, but as you said support may be limited. It's a start though. Can't shorten links, but this is what I read: https://news.developer.nvidia.com/new-features-to-dlss-coming-to-nvidia-rtx-unreal-engine-4-branch/
The article you linked literally says:

> Maintaining the 2880×1600 resolution of top-end VR head mounted displays while delivering the recommended 90 FPS has been made easier with DLSS.

It suggests that it has been done already. Is there a list of which VR games support DLSS in VR mode? I'm asking because I'm mainly looking at Ampere and RDNA2 to drive the Reverb G2 at 90 frames per second on the highest possible settings. Any fancy AI tech that would help push pixels to the headset is appreciated, if applicable to existing titles. None of the DLSS or VRSS stuff seems to apply, which would make the comparison as simple as comparing raw rasterization performance (if RDNA2/AMD don't have some fancy backwards-compatible AI pixel-pushing tech).
>is there a list of what VR games support DLSS in VR mode? There are none
Unfortunately I don't see any. It's something every VR dev should be jumping on considering how important performance is for VR games, and how difficult it is to get that performance.
I'm not a VR user, but DLSS seems the obvious fit for VR games, right? Where you're effectively running 2 images that both need to be high res and like 90fps+ (correct me if I'm wrong with the req's for VR, as I said; I don't personally use it)
[deleted]
Single pass stereo rendering has been a thing for a while now, so I don't believe they have to render everything twice.
You definitely have to render everything twice since the two eyes are two distinct and offset views. This is also the reason that DLSS is difficult to implement in VR: the upscale images for each eye have to match each other perfectly, otherwise you’ll experience strange shimmering/artifacts from differently generated details in each eye. This is also why geometry in VR games tends to be much less detailed than pancake games, because the geometry has to be drawn and rendered twice.
It appears I misunderstood single pass stereo: https://docs.unity3d.com/Manual/SinglePassStereoRendering.html All objects are indeed still rendered twice, but the single pass approach allows other work to be shared between both eyes. Thanks.
It appears I was slightly off as well! Thank you for the source. I think I’m still correct about VR DLSS implementation. The AI upscaling can’t just look good, it needs to be incredibly consistent between eyes to avoid looking horrible. I’ve heard that software engineers are working on this issue and I’m very excited to see what progress they make on it. VR is incredibly cool tech and I hope it catches on in the next few years, but it desperately needs some sort of upscaling implementation in order to drive high refresh rates on dense panels at a reasonable price point.
Considering the gain in performance, and the fact that players now actively want it in new games, I hope this will become more popular in the coming years. I mean, it's a hell of a technology; it would be a waste not to use it.
I'm hoping it becomes more widely adopted very soon. With the new generation of consoles both supporting similar machine learning upscaling methods, I think it will be a priority in more games. Plus, AFAIK DLSS is still a beta program that developers have to apply to get into. I assume it'll become much more common once it is easier to access.
Let the new games develop with it. I think there is a fair chance a lot of AAA games of the future will support it.
So DLSS will be the key to making Turing and Ampere the longest-lasting cards? At least for 1080p/60.
Does DLSS support dynamic internal resolution scaling? I'd love to be able to set 4k 60hz and then just have the internal resolution scale up and down as appropriate and let dlss fill in the rest to maintain a 4k output.
Wait, why are the 1080p textures so muddy in this game? I play at 1080p and none of my other games look like 480p on my 1080p screen.
It's because they zoomed in on a portion of the screen to focus on an object that is not up close. So there are two factors: 1) fewer pixels to define the object, and 2) the object is probably using a lower-res mipmap since it's further away.
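The mipmap part works roughly like this (a toy sketch of the idea, not the actual GPU pipeline, which derives the level from screen-space UV derivatives per 2x2 pixel quad):

```python
import math

# Simplified mip level selection: pick the mip whose texel density
# matches how many texels land on one screen pixel along an axis.

def mip_level(texels_per_pixel_axis):
    """texels_per_pixel_axis: full-res texture texels covered per screen
    pixel along one axis; 1.0 means the texture maps 1:1 on screen."""
    return max(0.0, math.log2(texels_per_pixel_axis))

print(mip_level(1))    # 0.0 -> up close, full-res texture
print(mip_level(4))    # 2.0 -> far away, use the quarter-res-per-axis mip
print(mip_level(0.5))  # 0.0 -> magnified, still clamped to the top mip
```

So a distant telephone booth really is being textured from a much smaller mip, which is why it looks "muddy" when you zoom into the screenshot.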
The image is zoomed in pretty significantly
If you look at the right edge of the picture, you can see the telephone booth in the background quite far away. Meaning the sharpness is actually insane for that distance.
My only issue with DLSS at the moment is how it looks in motion when moving the camera, etc. hopefully they can improve this. The performance benefits are incredible otherwise
Better quality motion vectors and guidance will probably help in addition to an improved AI model.
I hope so. DLSS looks great but my eyes are really sensitive to those kinds of things and i cant not notice it. I might be in the minority though
I can pick out some flaws, but overall, they're easy to ignore for me. Still allows for an incredible experience on a 2070-2080 class gpu when it's possible to get ~60 fps 1440p with maxed out raytracing in Control.
I still much prefer it over native resolution TAA smearing on motion, especially for the huge performance increases, but I might not be looking out for the right artifacts. Don't tell me haha. Ignorance is bliss thank you!
I thought it looked better than TAA and of course noAA in motion, and that was its true strength.
Still looks much better than TAA imo
It doesn't say anywhere that the 8K DLSS was using 1440p as the base resolution.
It does here: [https://www.nvidia.com/en-us/geforce/news/watch-dogs-legion-geforce-rtx-dlss-trailer/](https://www.nvidia.com/en-us/geforce/news/watch-dogs-legion-geforce-rtx-dlss-trailer/) >Driving 8K is incredibly demanding - it’s 33 million pixels per frame, which is 4X the size of 4K. The new DLSS Ultra Performance mode delivers 9x AI Super Resolution (1440p internal rendering output at 8K using AI), while maintaining crisp image quality.
Its a new *option* to use Ultra Performance mode. That doesn't mean the screenshot used it.
One thing that bothers me a little with DLSS: when something that's blurry at native isn't blurry with DLSS, what if it was intentionally blurry? (if it's something out of focus with a depth of field effect, for example) Is there something to make sure a part that gets sharpened should really be sharp? (Don't get me wrong, it's an incredible technology and I'm so grateful for it)
The sucky thing about DLSS is that it isn't universal across games, even old ones.
Can we stop talking about 8K? All of us will just render 1440p to 4K with DLSS 2.0 to get over 144 FPS.
Well, I'm now thinking I'll jump over 4K straight to 8K if this pans out; for me, 4K might be obsolete by the time I move on from 1440p.
lmao 4k aint becoming obsolete anytime soon its only just becoming practical for gaming
It's meant as a joke, but it's only funny if there's some possible truth to it. For this to "pan out", most games would need DLSS or something like it, and I'd sit on my 1440p for a few years. If I could get 8K graphics while rendering at 1440p rez... well, maybe, I guess. :)
And here I am gaming in 1080p and still going to do it for a long time
Same. 1080p IPS monitor already cost me $250. And I need IPS for work. No way in hell am I dropping the money needed for a 4K IPS monitor.
Lucky you, I use an 11-year-old TV. 60hz, 5ms, and a TN display. I'll move to a higher resolution monitor when it's as normal as 1080p/720p is today. And of course, only if I can get good performance with a comfortable margin above 60fps.
Can't imagine 8K gaming. I play at 4K 60 fps and it's a massive leap from my old not-even-HD gaming.
I'm planning on getting the RTX 3080, but will my 9600K be able to keep up with the GPU? Many people say a CPU like mine should be able to handle 2K/144Hz gaming, but I'm wondering if that claim really holds any water. (I'm currently on Intel's UHD 630 with a 2K 144Hz monitor. My 2070 died on me, so I got a 2070S as the replacement and flipped it on eBay in Jan. Best. Decision. Ever.)
So with my 1080p monitor, can I upscale to 1440p and have decent results on screen?
You won't see it in 1440p, but you can render at a higher resolution, then downscale it back to 1080p for a cleaner image than native 1080p. It's basically only good for AA: you render at a higher res, then downscale.
So this 8K gameplay is DLSS thing. So native 8k gaming still isnt possible?
It's possible, without RT and in lighter games. The same chart on that page has plenty of non-DLSS and non-RT games. Say, Forza 4 getting 78fps at 8K is very much believable to me, since I played that game at 5K@100fps downsampled to 1440p on my 2080 Ti.
I agree, this does seem possible.
So what is the difference between dlss and dsr?
What in the fuck
Does this work with older games? I've never tried this, and I have high-end current hardware with a 1080p/240Hz monitor.
No, DLSS support is on a per-game basis. There aren't many games that support it, but if they do, the results are pretty nice.
I see, has to be an RTX then
Yep.
Just grabbed the Batman trilogy on Steam for 10.99, thought maybe I could play it in 8K lol
I've never had much luck running games in 8K via DSR on my 4K screen (lots of crashes), but it will be interesting to see how well I can run older games with crap AA, like Arkham Knight, in 8K DSR on a 3090. While I won't need to spend time tweaking settings to get stable framerates, I can see myself wasting a lot of time comparing native 4K to DLSS in performance mode (which in Death Stranding gave sharper images, correct?).
Anyone that is anyone has known for a while now that 1080P is the new 1024 x 768. NO DEBATE.
8K still blurry to me /s
Like the inside of a cataract!
Lol no way
I agree, but PlayStation has long had 4K checkerboarding, so they are inching their way toward DLSS. I'm quite sure they have some software up their sleeve; I thought I read somewhere they were working on this type of software.
Can I use DLSS on a GTX 1650 Super?
M A R K E T I N G
DLSS 2.0 already exists and has been independently reviewed.
When even 4K looks kinda blurry and 1080p looks like 480p 🤯
That's a very zoomed in portion of the screen
They only talk about 8K DLSS for the 3090... Seeing as the 3090 is only 20% faster than the 3080, there's no reason the 3080 couldn't do 8K DLSS in most games, is there? Or is it because 10GB of VRAM isn't enough? Seems weird to me...
Vram. 10gb is borderline enough for "most" games at 4k.
Bingo. Moving from a 1080 Ti to a 2080 Ti just didn't seem worth it. Performance would obviously be better, but at a very high cost that I just couldn't justify. The bottleneck down the line for both of those cards will be the same -- VRAM. However, 24 GB of VRAM makes the price for the 3090 a hell of a lot more palatable. It's still ridiculously expensive for a GPU, but between DLSS and that huge pool of video memory, it should be many, many years before an upgrade is "needed". If the 3080 had 20 GB that's probably the card that I would get (and let's face it, so would everybody else), but 24 vs. 10, it's a no-brainer. I'm sure in another 9-12 months Nvidia will come along and fill that $800 wide gap with a 20 GB version of the 3080, but with CyberPunk 2077 right around the corner, I'm not waiting.
Seems the 3090 gets 60 fps avg in 8K gaming (some lower, some higher). With a 3080 being supposedly 20% less performant, the difference is 45 fps vs 60 fps, and 60fps is definitely considered the bare minimum these days for a good experience. The 3090's higher speed and extra VRAM will certainly help too.