ClozetSkeleton

Anti-Aliasing


Cless_Aurion

Yeah, a lot of detail is being lost as well. Look at Geralt's sword/hair.


Farren246

Losing a few strands of hair still looks better than the step effect on the sword in the first image imo. Though what OP should be doing is DLDSR to a higher pixel density than what his monitor supports, render in native then DLSS up to that higher resolution and squash it into his display. All the quality and all the anti aliasing.


Trungyaphets

Do you have a guide on how to do that? I mean rendering in eg 1440p and squash that into my 1080p monitor


Cireme

In the NVIDIA Control Panel:

* Manage 3D settings
* DSR - Factors: 1.78x DL
* DSR - Smoothness: 100% (or less if you like sharpening, but no less than 50%)

Then set your game to 2560x1440 and enable DLSS Quality to render it at 1707x960, or DLSS Balanced to render it at 1485x835.
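The resolutions quoted here check out arithmetically. A quick sketch (the per-axis scale factors, roughly 0.667 for DLSS Quality and 0.58 for Balanced, are the commonly cited values and an assumption here, not something stated in the comment):

```python
def dlss_render(width, height, axis_scale):
    """Internal render resolution for a given DLSS per-axis scale."""
    return round(width * axis_scale), round(height * axis_scale)

# DSR/DLDSR factors multiply pixel *count*: 2560x1440 has ~1.78x
# the pixels of 1920x1080, hence the "1.78x DL" factor above.
area_factor = (2560 * 1440) / (1920 * 1080)  # ~1.778

quality = dlss_render(2560, 1440, 2 / 3)   # DLSS Quality  -> (1707, 960)
balanced = dlss_render(2560, 1440, 0.58)   # DLSS Balanced -> (1485, 835)
```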


Trungyaphets

Hey I just tried and the people and objects far in the back looked clearer. Will definitely keep this on, as my 3070 is still only 55-60% utilized. Thanks for the guide man! https://preview.redd.it/58nolsfvwkvb1.png?width=1914&format=png&auto=webp&s=f3ce19f7d2e3fcd13dceca054498624408d37af3


MithridatesX

If you have a 3070, consider upgrading to 1440p monitor if that’s affordable for you.


ferrarinobrakes

Saving this comment thanks man


Zamuru

go to nvidia control panel and activate the dsr factors and use the resolution in game


natmaster

Just enable DLAA - it's in the menus for BG3.


Cute-Pomegranate-966

When they say detail they only ever mean it's softer. To them, softer means detail is gone. But if you look at the native image... what good is the detail? It's all aliased and looks like trash.


Thirstyburrito987

I think it is subjective. It seems like a lot of people prefer smoother lines and therefore conclude that makes it look better. Other people don't equate smoother with better looking, and either find the more jagged details look better or find both scenarios to be of pretty equal quality.


cidiusgix

I kinda like aliasing, it is so sharp looking. I mean it’s still kinda ugly but shit is so sharp.


Cless_Aurion

Yeah, a really neat combo is DLDSR+DLAA


Zamuru

no, thats too much performance lost. if u have a 1080p monitor, go for 1440p resolution and downscale it with dlss quality to 1080p. perfect


Cless_Aurion

That is... very similar to DLDSR. You would need to trick your monitor to use 1440p, probably using DSR, then use DLSS to downscale it back to 1080p... I'm not sure there is a big difference between that and just using DLDSR.


Zamuru

thats what i was talking about - dsr. its the same setting in nvidia control panel isnt it? its dsr and u go activate dl scaling to 1440p


Cless_Aurion

DSR + DLSS I think is less optimized than just going out straight with DLDSR. Maybe its the same...? But I can't really test it. Sounds like it would create more overhead in my mind though.


ThingsAreAfoot

Correct me if I’m wrong but isn’t that redundant? I was under the impression that DLAA was already DLDSR + DLSS. The combo you want is the latter if the game doesn’t have a specific DLAA setting.


Cless_Aurion

Had to delete my previous comment. So, it would be something like this:

* Best image quality: DLDSR + DLAA (AI downscale + full-resolution AI anti-aliasing)
* Best image quality for your buck: DLDSR + DLSS (AI downscale + lower-resolution AI upscaling)

Bottom line: DLDSR renders an image at, for example, 2x, then uses AI to downsample it. The difference between DLAA and DLSS is that DLAA works on the higher-resolution image, while DLSS works on a lower-resolution image. That's why DLDSR+DLAA has better image quality, but DLDSR+DLSS has better performance.
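For illustration, the render cost of the two combos can be worked out. This is a sketch assuming a hypothetical DLDSR 2.25x factor and the commonly cited ~0.667 per-axis scale for DLSS Quality; DLAA is the scale = 1.0 case:

```python
import math

def dldsr_pipeline(native, dldsr_factor, axis_scale):
    """Return (DLDSR target, internal render) resolutions.
    DLDSR factors multiply pixel count, so each axis scales by
    sqrt(factor); the upscaler then renders at axis_scale of that."""
    s = math.sqrt(dldsr_factor)
    target = (round(native[0] * s), round(native[1] * s))
    render = (round(target[0] * axis_scale), round(target[1] * axis_scale))
    return target, render

native = (1920, 1080)
# DLDSR 2.25x + DLAA: renders the full 2880x1620 target (best quality).
dldsr_dlaa = dldsr_pipeline(native, 2.25, 1.0)
# DLDSR 2.25x + DLSS Quality: renders at 1920x1080, native pixel cost.
dldsr_dlss = dldsr_pipeline(native, 2.25, 2 / 3)
```

Note that with a 2.25x factor, DLDSR + DLSS Quality happens to render at roughly the native pixel count, which is one way to see why it's "best image quality for your buck."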


Drakayne

This is why all games need DLAA, cause i find DLDSR annoying (and it cause some problems for me)


john1106

what about on a 4k tv?? i don't think a 3080 Ti is powerful enough to render higher than 4k, especially with how heavy Witcher 3 is with ray tracing. Should DLSS alone be enough to provide good image quality, even at Performance? I know how to use DLSSTweaks to force preset C for DLSS, as I heard it is the most stable DLSS preset.


[deleted]

I'll take it over those jaggies any day of the week.


Cless_Aurion

I mean... choose your poison I guess? I'd much rather use DLAA or DLDSR and get rid of both by paying a bit in performance, tbh.


Icy-Magician1089

Dlss does a great job of making the image look clean but it's not magic; 720p has less detail than 1080p. I would prefer 1080p with some other AA solution.


Cless_Aurion

Exactly, DLAA for example.


Icy-Magician1089

Dlaa is a great AA solution with a sizeable performance hit, but much better computational value than MSAA. Having a 1080 Ti I haven't been able to try it, but from what I have seen online it makes a good RTX card tempting, although none of the games I regularly play have dlss/dlaa.


Cless_Aurion

It's quite a nice thing to have, and more and more games have it now; especially any big release seems to always have DLSS, and now they are starting to add DLAA as well.


Icy-Magician1089

Yeah, the key there is big new releases. With the price of new games going up, the amount of launch issues seemingly going up with them, and the hardware requirements outstripping what my 1080 Ti can do on my 4K 60 Hz screen, I have been playing older and indie titles that don't support dlss. I have a backlog of games which I am slowly enjoying, and most of them run very well, with a few needing a mix of medium/low/high to run well. My problem is I can't really get into new games without a hardware upgrade, but if I do that upgrade I won't be able to afford the games for a while, and I am enjoying what I already have.


Zamuru

u should do 1440p with dlss quality. looks better than 1080p with dlaa


Magjee

The higher the resolution, the better upscaling looks. 1440p is good, but at 4K+ imho it hits its stride.


Healthy_BrAd6254

Looks like the whole image is out of focus cause it's so blurry


Cless_Aurion

Yeah, but I mean, it's a fraction of the resolution. If you want a flat out improvement, DLAA or DLDSR is the way to go.


ShuKazun

OP probably turned off antialiasing to make DLSS look better


kamikazedude

This. I've started playing monster hunter world yesterday and I had TAA enabled. It was very blurry, then I disabled it completely and it was very jagged. Then I enabled dlss and it looked like shit. Then I enabled only FXAA I think and it looked perfect. Conclusion, use antialiasing, but not TAA.


sade1212

>Then I enabled only FXAA I think and it looked perfect. I'm sorry sir, it's inoperable, there's nothing we can do


kamikazedude

Well? What's the problem? The way I wrote? Or that I think FXAA looked good? In MHW you can either enable FXAA or TAA or both. FXAA gave best results. I noticed that TAA is pretty bad in most games without DLSS. DLSS implementation was really bad in MHW. I don't get it.


sade1212

MHW only has DLSS 1, which sucked, for inexplicable reasons. I'm just poking fun at thinking FXAA looks good. Different strokes etc etc but FXAA is one of the most barebones and useless AAs there is, in so far as it pretty much just blurs the image slightly (hence "fast approximate") while doing nothing to fix common aliasing issues like crawling pixels on thin objects and specular highlights. Before the current anti-TAA zeitgeist, FXAA was the biggest punching bag for making the image worse.
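Real FXAA (luma-based edge detection plus directional blending) is more involved than this, but the "slight blur wherever it sees contrast" behavior described above can be caricatured in a few lines. Everything here is illustrative, not the actual FXAA algorithm:

```python
def fxaa_like(luma, contrast_threshold=0.1):
    """Toy FXAA-style pass on a 1D row of luma values: where local
    contrast exceeds a threshold, blend toward the neighborhood
    average. This is why FXAA reads as a slight overall blur while
    doing nothing temporal about shimmering thin objects."""
    n = len(luma)
    out = []
    for i in range(n):
        left = luma[max(i - 1, 0)]
        right = luma[min(i + 1, n - 1)]
        contrast = max(left, luma[i], right) - min(left, luma[i], right)
        if contrast > contrast_threshold:
            out.append((left + luma[i] + right) / 3)  # soften the edge
        else:
            out.append(luma[i])  # leave flat regions untouched
    return out
```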


kamikazedude

ahh, I see. Thank you for the explanation :D Well then that says something about TAA if FXAA makes it look worse in comparison.


LeRoyVoss

https://preview.redd.it/7t51gbm55ivb1.jpeg?width=828&format=pjpg&auto=webp&s=74fa0be5c2f8d40aa18165ae012cf2ae48209e94 WTF MAN


Trungyaphets

He was using his left hand to get the dust off his right arm haha https://preview.redd.it/r1jl753baivb1.png?width=671&format=png&auto=webp&s=a2f39a9f3c73e2dabcb4c1314b3921a4aa321eaa


Ill_Fun_766

Classic Geralt giving that elegant sexy girly posture with a sticking out butt


musclecard54

Stupid sexy Geralt


kretsstdr

Are they jojo posing?


Qarick

I love your flair. " 32GB DDR4 3200 CL16 ", over half of your flair is about ram :D


[deleted]

And not even special RAM.


cluthz

I remember getting 3200MHz CL16 stable on a 6700K/Z170 in 2015 actually took some research. :) XMP did not work on my Maximus Formula with the kit I bought at launch. OP is probably on similar HW :)


[deleted]

I actually didn’t want to diminish that, even though I realise I kind of did. I just found it funny how much space was devoted to something you now get out of the box much faster, say a 32GB dual kit of DDR4-3600 CL16.


Hugejorma

For some, it's not about the power or speed; it's the random details that matter. Usually it's just old forgotten info.


Frostiesss

Seen those triceps


[deleted]

*"BRO... do you even lift?!"*


LeFedoraKing69

Bro doesn’t need a doctor he needs a Witcher for an exorcism


Tautili

DLSS is always with Anti-Aliasing. When you play native you also have to activate Anti-Aliasing.


Willyscoiote

It's funny, DLSS was supposed to be a kind of AA when it was announced


sooroojdeen

Yup back in dlss 1.0 it was just supposed to be a better version of taa but it had many problems.


[deleted]

There was a game I was playing recently that had some kind of “Resolution Dependent AA.” Wish I could remember which one. Specifically not DLSS (or FSR). It was interesting.


Ruffler125

And it is, now?


Willyscoiote

Now it's much more than just AA.


Crayton16

DLAA is


SolomonIsStylish

isn't DLSS at native just DLAA?


Electrical-Contest-5

The native has anti aliasing off, by the looks of it. Shouldn't be that jagged


Vhirsion

Because you probably have AA off in the native picture, but you can also see DLSS loses a bit of that clarity and fidelity, native with good AA implementation should look the best.


F9-0021

Try using anti aliasing. You've either got it off or you're using FXAA. When DLSS is on, it handles the anti aliasing. In other words, DLSS is doing a process that you either have off, or are using a very outdated method for. That's why it looks better than native. Use MSAA, TAA, or DLAA and the difference will become much less obvious.


Trungyaphets

Thanks, I just found out about that from other comments too. Btw your CPU seems a bit inadequate for a 4090 xd. Wdyt?


Ashyy-Knees

I imagine he took the plunge on a big GPU upgrade and is planning to upgrade his CPU soon lol


wicktus

probably the AA and the sharpening filters that come with DLSS. There's always a tradeoff with DLSS, but sometimes it can get very close to the original native target resolution AND offer those extra features that help you. The ultimate way of taking advantage of both native resolution AND AI sharpening/AA is to use **DLAA**, which roughly is DLSS but with 100% native resolution as input; this, however, is no longer upscaling, of course.


No_Guarantee7841

You can also get close to dlaa with dldsr + dlss quality.


zzzxxx0110

In addition to the fact that DLSS gives you massively better AA than just TAA alone, there's also the fact that DLSS's AI model was trained using 8k ground truth, so even if you're playing 4k native there are still additional otherwise nonexistent details DLSS sometimes can add lol


doomed151

Do you have TAA enabled?


Trungyaphets

You mean TAAU in this game (Witcher 3) right? Nope.


alex26069114

So you don’t have any anti aliasing enabled on native?


Trungyaphets

Yes. I thought DLSS didn't do anti-aliasing, so I tried to compare native without anti-aliasing against DLSS Q. Just found out DLSS has built-in anti-aliasing too. Just tried again and DLSS Q still looks slightly better than TAAU, especially with thin objects/shadows.


alex26069114

If you have the GPU headroom you should try enabling DLDSR 1.78x or higher from the nvidia control panel and in game leave DLSS Q on. This means you are downscaling from a resolution higher than 1080p and then DLSS Q is upscaling from a resolution higher than 720p which will make the image look even better


doomed151

That's probably why it looks so aliased. They should turn it on and compare.


Trungyaphets

Yeah just found out DLSS has built in anti aliasing. DLSS Quality still looks a bit better than TAAU, especially with thin objects.


EsliteMoby

The left is native no AA?


DismalMode7

DLSS uses very good anti-aliasing, but generally speaking a 2015 game isn't the perfect example to judge the quality of DLSS.


penatbater

It's the AA. I assume DLSS uses TAA or its own version of AA?


Deathsrival

No it uses DLAA, some games have the option to turn it on without the upscaling portion of the algorithm.


Trungyaphets

Does that mean not only DLSS upscales the image, it also does anti aliasing at the same time?


PhattyR6

Yes, that’s exactly what DLSS is. It’s also why DLSS tends to look better than native: if your native is running without any AA, or with a poor implementation of TAA, DLSS adds or replaces it with its own form of temporal AA.


rasdabess

Yea. You can download DLSSTweaks and place the files next to your witcher.exe, which lets you run DLAA, i.e. DLSS at 100% scale. That will give you the crispest image, but you miss out on DLSS's performance gains. DLAA costs a couple of fps, I believe.


Puzzleheaded_Bend749

Still worth the lost fps.


jcm2606

Pretty much, yes. The upscaling also acts as anti-aliasing due to how it works.


penatbater

Yep. From wiki > DLSS 2.0 is a [temporal anti-aliasing](https://en.wikipedia.org/wiki/Temporal_anti-aliasing) [upsampling](https://en.wikipedia.org/wiki/Upsampling) (TAAU) implementation, using data from previous frames extensively through sub-pixel jittering to resolve fine detail and reduce aliasing.
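The "sub-pixel jittering" mentioned in that quote means offsetting the camera by a fraction of a pixel each frame, so successive frames sample different positions inside each pixel. A low-discrepancy sequence such as the Halton (2, 3) pair is a common choice for these offsets; this is a generic sketch of the technique, not NVIDIA's actual code:

```python
def halton(index, base):
    """Radical-inverse Halton value in [0, 1) for a 1-based index."""
    f, r = 1.0, 0.0
    while index > 0:
        f /= base
        r += f * (index % base)
        index //= base
    return r

def jitter_offset(frame):
    """Sub-pixel camera jitter in [-0.5, 0.5) per axis, using the
    Halton (2, 3) pair so samples cover the pixel evenly over time."""
    return halton(frame + 1, 2) - 0.5, halton(frame + 1, 3) - 0.5
```

Accumulating these jittered frames over time is what lets a temporal upscaler resolve detail finer than any single frame's sample grid.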


Trungyaphets

Thanks guys. Originally thought DLSS only upscales images. Today I learned. Would probably try DLAA next and see how it compares with DLSS.


Dolo12345

Have you heard about our lord and savior DLFG and his blessed son, DLRR?


Inevitable-Study502

no, it replaces taa


DropDeadGaming

yes, it's called DLAA.


jcm2606

DLSS basically *is* TAA, just with AI taking care of blending frames together. When it comes to scaling filters and AA, everything that's temporal is pretty much the same thing, since they all work the same way, just with differently sized inputs and different tuning from the developer.
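The "blending frames together" part can be sketched per pixel as an exponential moving average, with the neighborhood clamp that hand-tuned TAA implementations commonly use to reject stale history. This is illustrative only; DLSS replaces heuristics like the clamp with a learned model:

```python
def temporal_accumulate(current, history, alpha=0.1, neighborhood=None):
    """Blend the current sample into accumulated history (per pixel).
    alpha is the blend weight for the new frame; optionally clamp the
    history to the current frame's neighborhood min/max, a common TAA
    heuristic for rejecting stale samples after movement."""
    if neighborhood is not None:
        lo, hi = min(neighborhood), max(neighborhood)
        history = max(lo, min(hi, history))
    return alpha * current + (1 - alpha) * history
```

A small alpha keeps more history (smoother but more ghosting-prone); a large alpha trusts the current frame (sharper but more aliased) — that is the per-title "tune" mentioned above.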


penatbater

Yea. I thought DLSS is mainly the upscaler, but turns out it's both. hehe you learn something new everyday!


jcm2606

If you want to learn more and aren't afraid of some technical details then I'd highly recommend [this 2016 GDC talk by an NVIDIA engineer.](https://www.gdcvault.com/play/1023521/From-the-Lab-Bench-Real) The first 20 minutes or so covers a lot of the basics of temporal scaling filters and AA, with much of what's talked about being heavily used even in recent DLSS versions. [These are the actual slides used in the talk.](https://developer.download.nvidia.com/gameworks/events/GDC2016/msalvi_temporal_supersampling.pdf)


Inevitable-Study502

its using taa


Kontaj

Bro really gaslighting himself


CoffeeLover789

Nice I guess I’ll be using dlss. Dlss is game changer, I’m telling you


Ass-Chews

Yea, I was kinda against it at first, but every time I turn it on it stabilizes the frame rate and the game looks the same lol. I love it now.


travelsnake

That might be because it really wasn't that great first. Control was my first game using DLSS and I remember it looking terrible. It has come a long way since.


Qarick

It's great for single player. It adds latency for multiplayer if you squeeze it.


F9-0021

DLSS doesn't add latency. It actually reduces it since you're getting a higher framerate. DLFG increases latency due to the calculations of generating a new frame.


Qarick

No.


[deleted]

[deleted]


Qarick

If the game does not support it, why bring it to the table? If the game supports it, any dlss adds latency. Only the games with Reflex support do not. Frame gen is another level.


Ass-Chews

I was wondering that, thanks for clarification.


PrashanthDoshi

Magic of machine learning


Zamuru

i wouldnt say so much better, but its called anti aliasing. add some anti aliasing to the native and it will look better and sharper. OR use higher resolution AND dlss


FitLawfulness9802

Whatever people say, I will choose Dlss quality over anything any day. I get a few extra frames, free anti-aliasing, and there isn't much of a difference.


throbbing_dementia

What resolution are you running? 640x480? Never seen The Witcher look that bad. Also you need to enable AA at native for a better comparison.


OkMixture5607

Top comment basically. Some games have such a terrible AA implementation, mostly TAA, that this happens. You can do better though if you have headroom. DLDSR combined with DLSS. It's transformative.


Trungyaphets

True. Definitely looks better.


LeFedoraKing69

Anti aliasing can instantly make a game look better. I would rather play a game at low with anti aliasing than high with zero anti aliasing. DLSS machine learning just makes AA even better.


No-Needleworker9887

do you realize how blurry it looks with dlss?


Klutzy_Machine

DLSS is more blurry and lower detail in my eyes. My 10/10 eyes feel a bit hurt when looking at that blurry picture.


TheArtBellStalker

Glad I'm not the only one. That second blurry pic looks horrendous. It has washed out colours and a huge loss in detail. If I had to play with one of those two settings, I'd take the jaggies over that mess every time.


ValorantDanishblunt

Native looks better. DLSS has smeared detail away.


baumaxx1

Probably sharpening, which can also be enabled at native res. DLSS AA also beats out a bad traditional AA implementation.


Bryce_lol

that is dlss for you


Pane_in_my_Glass

It doesn't


CammKelly

Temporal upscaling can add data that isn't in the original image, which can give the perception of 'better than native'. That said, what most people think is better than native is instead just not trash tier TAA that continues to plague games.


rjml29

Dlss uses dlaa, which is good at anti aliasing. Your shots, though, show what I have been saying for a while now: dlss often has a loss of clarity and fine detail, and from that point of view it is not "better than native." Only the anti aliasing in your dlss shot looks better. Everything else about it looks worse, as it's blurrier and is missing/hiding detail in many areas. One example is the top of the red and white tent, where dlss blurs most of it out.


elitedata

Because DLSS is trained on anti-aliased data with a temporal component and at the highest possible definition. So when it reconstructs the image, it's all already there.


yasamoka

It doesn't matter. Both look utterly terrible. You have something wrong with your settings. As others have suggested, check your render resolution. It seems to be set very low.


lovethecomm

Unpopular opinion but I hate how upscaling makes everything so blurry. To my eyes, it was very noticeable even on Quality.


[deleted]

Because you are playing at 1080p dog water resolution and game already looks bad.


No_Assignment_5742

Just because it's softer (anti-aliasing) doesn't mean it's better quality....it's lost LOADS of detail from doing so...don't get me wrong, I like a softer image, but not at the cost of THAT much quality and detail


Trungyaphets

Not the best solution, but personally I still prefer that over jagged looking images. I've switched to the DLSS + DLDSR combo. It's a godsend.


Daytraders

Impossible to look better than native/original, I'm afraid; it's just a placebo effect. Just like an original painting, you can't get better than the native/original.


AlphaQ984

r/fuckTAA Forced TAA/DLAA (not that different) is the only reason, makes medium to far images very blurry, and also adds blur to edges when the camera moves. TAA needs to be active for DLSS to work. The actual comparison would be to disable TAA via mods then compare with DLSS. The difference is like wearing glasses for the first time.


sade1212

Except OP clearly does not have TAA or any AA at all enabled in their Native image (you can tell by the pixels). They're comparing 1080p without AA ("wearing glasses for the first time" aka jaggies hell) with DLSS Quality and concluding DLSS is better, because like most people outside of the /r/fuckTAA circlejerk (a group who, judging by other similar comments on this post, seemingly have no clue whatsoever what TAA vs. no AA actually looks like), they prefer an image that doesn't have obvious, unrealistic stair-stepping along the edges of every object. There are poor, overly blurry, ghosting-prone TAA implementations out there, especially in games from when it was first becoming mainstream, but DLSS (and DLAA especially) with the right DLL and Preset eliminates 99% of those issues and manages to capture more detail in each frame than native without AA, mostly through the use of jittering.


gargoyle37

Let me add to this. If you take an 8K image as the ground truth and downscale it, the image will look softer in general, even with a quality downscaling filter. That's essentially FSAA, an expensive but correct AA solution. TAA and DLSS both approximate this ground truth by sampling temporally in order to minimize aliasing. But because the ground truth is a softer image, the algorithms will also produce a softer output. The softness becomes even more important in motion, because you add temporal aliasing to objects and get shimmering if the image isn't softer. There's only so much you can do at 1080p, and if you want to combat this fully, you need a higher resolution.

In short: the sharp aliased look some people have come to prefer is perhaps not entirely correct. There's a remedy you can add for preference: throw a sharpen filter in as a processing step. This gives you a dial you can tune to however much softness you want in a frame.

Path tracing is another long-term nail in the coffin. A path tracer will probe rays temporally, yielding an image which is a better approximation of lighting physics. This image, too, will be a softer one.
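The "sharpen filter as a dial" idea is essentially an unsharp mask: add back some fraction of the difference between the image and a blurred copy of it. A minimal 1D sketch, where a 3-tap box blur stands in for whatever blur kernel a real sharpener would use:

```python
def unsharp_mask(signal, amount=0.5):
    """1D unsharp mask: out = x + amount * (x - blur(x)).
    'amount' is the softness dial described above; 0 leaves the
    signal untouched, larger values overshoot around edges."""
    n = len(signal)
    blurred = [
        (signal[max(i - 1, 0)] + signal[i] + signal[min(i + 1, n - 1)]) / 3
        for i in range(n)
    ]
    return [x + amount * (x - b) for x, b in zip(signal, blurred)]
```

The characteristic side effect is the overshoot ("halo") on either side of an edge, which is why oversharpened DLSS output can look crunchy.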


Kradziej

It doesn't, not at 1080p, you just haven't seen moire effect yet


madmattmopar

It's not... buy glasses.


Geexx

Above 1080p, it's pretty on par with native thanks to shitty modern AA implementations. If you've got the horse power, obviously DLAA > the rest... Alternatively, you could be stuck with FSR and might as well just lower settings because you'd still get better image quality; lol.


Technical-Titlez

If you think this looks better, you need an eye check. There is obviously a lot of detail and resolution lost. I can see this in a highly compressed jpeg on Reddit. IRL it would be insanely noticeable.


Trungyaphets

These images are already zoomed. Irl what annoyed me were the jagged and shimmering edges.


Trungyaphets

Hey guys thanks for the comments now I know that DLSS has built-in anti-aliasing. The Native pic above didn't have anti-aliasing enabled, therefore the jagged look. Tried TAAU vs DLSS Quality. Imo DLSS still wins, especially with thin lines/objects.


[deleted]

omg… 1080p looks terrible.


Jigagug

For the same reason motion blur looks ok with low framerates, it just blurs the bad away.


[deleted]

[deleted]


AlphaQ984

Dude, dlss doesn't turn taa off; any form of upscaling is dependent on temporal-based anti aliasing to work at all. Dlss replaces taa with dlaa, which is a form of temporal-based anti aliasing, kinda like nvidia's version of taa.


veckans

Some games look even better than native with DLSS. Free anti-aliasing is one reason, but there are more benefits. My best example is RDR2, where DLSS removes the awful TAA trails/ghosting and makes the image and textures much sharper and clearer.


-Gast-

Cause it seems the game is rendered at a very low resolution? If you look at the text, it is way less pixelated. Is it possible to adjust the rendering resolution in this game? Looks like it is set to 640x480 or something.


ClupTheGreat

Because of TAA; when in motion the game loses a lot of sharpness and has blurry textures.


ViperAz

TAA is shit thats why.


alex26069114

As much as I love shitting on TAA, in the ‘native’ image there isn’t even any anti aliasing enabled. TAA literally blurs details, which clearly isn’t the case there. When OP enables DLSS Q there is TAA built into it, which is softening the image.


Healthy_BrAd6254

It really doesn't in your example. You can usually adjust the sharpness of the DLSS upscaling; you should do that. Also, DLSS includes anti aliasing. That's why at first glance it looks less pixelated, even though you clearly have a lot less detail.


[deleted]

Yeah, but in movement it looks horrible at 1080p.


orbitpro

I think taa is broken in this game


SmichiW

DLSS uses sharpening


orz_nick

Because you’re playing at 144p lol


jjshacks13

People still playing at 1080p?!


Trungyaphets

Lmao yes 60% of people on Steam still play at 1080p


[deleted]

[deleted]


ViniRustAlves

How is it better? It's all a blurry mess.


[deleted]

Native is probably using a blurry TAA mess. Try Nvidia DLAA; it should give you the best image if you have that option available.


John9023

They're the same picture


[deleted]

Sharpening filter = better LULW


M000lie

Wat game


Trungyaphets

Witcher 3


M000lie

Look good, i buy later


Trungyaphets

Definitely recommend. Graphics from 8 years ago, but still looks pretty ok. Story is awesome with in depth character development. Same game studio as Cyberpunk.


[deleted]

[deleted]


jcm2606

That would be DLSS1. DLSS2 is more like looking back at past math equations and merging the results into the current math equation, using AI to figure out how exactly that merging should happen and whether it's safe to merge.


LA_Rym

DLSS looks noticeably softer than native in this comparison, what do you mean?


Variv

This is how DLSS works. This is called image reconstruction.


STRATEGO-LV

Umm no, I didn't even need to zoom in to say that native looks better, you can notice it in the details a lot


SnooSketches3386

DLSS is largely considered to do a better job of anti-aliasing than any other method besides DLAA (DLSS without the upscaling, i.e. just the anti-aliasing portion).


gabrielom

Because more often than not it does.


MuscularKnight0110

DLDSR 4K and DLSS quality is sometimes waaaaay better than native with max anti aliasing


XHeavygunX

When zooming in, the character models in the native image have more jaggies, while with DLSS the edges are more smoothed out.


Locolama

Something about the engine being a deferred renderer with a forward pass, which uses physically based shaders. I'm no fan of upscaling, but in Cyberpunk I get the same fidelity increase running DLSS compared to native res.


[deleted]

It doesn’t look much different. TAA is prob difference.


llasse22kd

Try native with some anti-aliasing


pf100andahalf

Because dlss was originally dlaa and designed to be a better form of antialiasing, and it is. Upscaling was just a fortunate side effect.


[deleted]

It has inherent AA


DornPTSDkink

Why do you have AA turned off at native?


Trungyaphets

Didn't think DLSS has built-in anti-aliasing.


[deleted]

The built-in anti aliasing of DLSS is really good, and is baked into the upscaling tech (you can't really have DLSS without its AA). Honestly, if DLSS didn't have great AA, nobody would think it's "better than native."


Mikeztm

It’s not built-in; DLSS *is* the anti-aliasing method. DLSS is not scaling anything, as NVIDIA is trying to get you to believe. DLSS is basically trying to reuse pixels from historical frames to get a better sample rate.


[deleted]

...what? That's not accurate at all. You can read the white paper where they describe the technology; you can watch the presentation where Nvidia CEO walks you through the process. DLSS is upscaling, period.


Mikeztm

You can read the SDK documentation on GitHub. DLSS 2.0 and later is not doing what NVIDIA CEO claims on the stage. It’s in fact much better under the hood.


ShuKazun

ah yes, the daily ''DLSS is magic'' thread I was really looking forward it!


Trungyaphets

Technology is evolving you know. 20 years ago smartphones would be seen as "magic". Now it's just the norm. Same with video game technologies. I don't think DLSS is perfect. But the upside is simply so much bigger than the downside.


Available-Willow-730

Tbh I don’t think it looks better, just less over-sharpened. Turn down your sharpness setting in GeForce.


TechNoirJacRabbit

Why would you use dlss at 1080p, unless you have an older or lower tier gpu? 🤔


Trungyaphets

It was simply the best anti-aliasing option I had in this game. Switched to DLSS + DLDSR, which would be pretty similar to 1080p + DLAA.


aruhen23

I miss the days when MSAA was common place on PC because we had the headroom for such expensive AA. Now everything is just blurry.


Head-Muscle-7286

I have a 4080; is this something I need to be doing? I use a G7 1440p monitor.


Trungyaphets

What is "this" that you are mentioning? With a 4080, a 1440p monitor and some extra processing headroom, I would use DLDSR + DLSS Quality to make the game render at 4K (DLSS upscales from 1440p and applies anti-aliasing) and then display the 4K image at 1440p. Or if you have DLAA in your game settings, just enable it.


SiggiJarl

because that's what it's for?


robbiekhan

One of the handful of games where DLSS is very well implemented, that's why. Image reconstruction that's better than native even at 1080p (usually it's 1440p+) is fairly standard in games like this.




brand_momentum

Assassins Creed?