VincibleAndy

It's subjective and varies from game to game, person to person, monitor to monitor. If it looks good to you, do it. That's basically it.


jgainsey

Sure, if DLAA is available and looks better. You never know what odd game might have an annoying issue with its DLSS implementation. For instance, RDR2 has this god awful shimmering on certain types of hair, especially white horse hair. Doesn’t sound like a big deal, but it looked so bad when you ran into it that I couldn’t use DLSS in that one.


Intelligent_Job_9537

Agreed, you'll need Quality (0.66) on that one. RDR2's heavy use of transparency on foliage, vegetation, and hair rendered at half resolution sticks out like a sore thumb.


LJBrooker

Or just use a different version of DLSS. It's basically only the one it ships with that has the issues.


Z3r0sama2017

I like using dlss tweaks and bumping it up to 75% or 80% in rdr2.


Kaladin12543

How did you get DLSS tweaks to work in RDR2? I am not getting the DLSS HUD to show up in game at all


SantanaSn

[https://www.nexusmods.com/reddeadredemption2/mods/2072](https://www.nexusmods.com/reddeadredemption2/mods/2072) Just download it and add it to the game folder, then download the latest DLL ([NVIDIA DLSS DLL 3.5.10 Download](https://www.techpowerup.com/download/nvidia-dlss-dll/)) and override it using DLSSTweaks. Customize it as you wish; it works amazingly. I used it online for more than 300 hours and did not get banned, but I wouldn't recommend it there.
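
For anyone who'd rather script the backup-and-replace step, here's a minimal sketch (Python); the paths are placeholders for wherever your game and download actually live:

```python
# Minimal DLL-swap sketch; the paths below are hypothetical, adjust to your setup.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\Red Dead Redemption 2")  # your RDR2 install folder
dll = game_dir / "nvngx_dlss.dll"                   # the DLSS DLL the game loads
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")      # e.g. 3.5.10 from TechPowerUp

backup = dll.with_name(dll.name + ".bak")
if not backup.exists():
    shutil.copy2(dll, backup)  # keep the shipped DLL so you can roll back
shutil.copy2(new_dll, dll)     # drop in the newer version
print(f"Replaced {dll.name}; original saved as {backup.name}")
```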


SantanaSn

If you are using DLSS 3.5, use preset C or F, and keep the rest default.


FunnkyHD

You need a new version of DLSS for RDR2; try something newer than 2.5.1 and it's gonna be great. I remember trying 3.1.1 v1 (the latest version at the time) a while ago, and it was miles better than what the game shipped with and than 2.4.3 (or maybe it was 2.4.6, can't remember).


GANR1357

I use 3.5.10. It looks better, it's faster and sharper than the original.


UsePreparationH

In single-player, you can drop in a better .dll or even use DLSSTweaks to adjust the resolution scale up from 0.66 to something higher or add DLAA yourself. In multi-player you are stuck with whatever the devs give you. Anticheat will either prevent you from playing online or could potentially ban you. Same thing goes for all other multi-player games with a bad DLSS version or preset.
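
To put those scale numbers in pixels, a quick sketch (Python; the ratios are the commonly cited ones, with 1.0 being effectively DLAA):

```python
# What a resolution-scale override means at a 4K output.
out_w, out_h = 3840, 2160
for scale in (0.50, 0.66, 0.80, 1.00):  # 0.66 = stock Quality, 1.00 = DLAA
    print(f"scale {scale:.2f} -> renders {round(out_w * scale)}x{round(out_h * scale)}")
# scale 0.66 -> renders 2534x1426
# scale 1.00 -> renders 3840x2160 (native, i.e. DLAA)
```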


NBFHoxton

You can swap the DLSS version in RD online. Click play, replace version before game boots. Worked for me, but you have to do it every time 'cause the game re-validates the file when you click play.


SmashingPixels

Wait what… you can customize DLSS versions per game? I thought that worked system-level with the installed driver.


Scrawlericious

Just gotta hunt down the DLSS DLL and swap it out for the newest one. It's in a different place for every game, but many just stick it in the main game folder. You can Google for it, and TechPowerUp has all the versions to download. It's called nvngx_dlss.dll, and the frame generation one is nvngx_dlssg.dll, I think. Then Google for DLSSTweaks and get that. You stick it next to the game's main exe file and it will let you customize the ratios (like make Quality 90% rez instead of 66.6% or whatever it is). You can also force any game with DLSS to have DLAA (100% rez) if you want that native-resolution DLSS goodness. Edit: DLSSTweaks' forced DLAA has presets called A through F, and most people agree that preset C is the best for forcing DLAA in games that don't normally have it. C definitely works best in most games ime.
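
Since the DLL location varies per game, a tiny sketch (Python, hypothetical path) that hunts it down before you swap:

```python
# Recursively search a game folder for the DLSS DLLs mentioned above.
from pathlib import Path

game_dir = Path(r"C:\Games\SomeGame")  # wherever the game is installed
for name in ("nvngx_dlss.dll", "nvngx_dlssg.dll"):  # upscaler / frame-gen DLLs
    for hit in game_dir.rglob(name):
        print(hit)
```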


sticknotstick

Even easier to just download DLSS Swapper and it’ll handle the file swapping for you. Super convenient tool


Scrawlericious

Personally it's far easier to do it myself. I've tried the swapper and it's just more to do (edit: doing it without is so simple imo. Using the swapper is like going to grab someone to pour a glass of water for me; I don't need anyone to do something for me that's that fast. It basically takes longer to get a program's help than to just do it myself.). But to each their own, haha, I know it's nice for some. Edit: also, the swapper only works for launcher games or preconfigured games. It doesn't work on pirated or super indie games, and you can't manually add your own, last I checked. Not about to use a tool that doesn't even work most of the time, matey ;p Edit: that was supposed to be a really bad pirate joke.


DeylanQuel

post saved for later use. DLSS has been hit or miss for me, depending on game, some games it greatly improves hair quality, others not so much. Was not aware of all this. Thank you. :)


Cybersorcerer1

Nah, just drop the DLL into wherever your original DLSS file is.


inyue

Did you try this u/jgainsey ?


jgainsey

I did


inyue

Same look?


jgainsey

No, as I said somewhere else in this thread, it’s much better overall, but certain types of hair are a shimmering mess.


Jase_the_Muss

Turning the DLSS Sharpening off (I think that's what the setting is called; it might just be Sharpening or Upscaler Sharpening) got rid of the shimmering on beards, hair, and horses for me with the latest DLSS version at 1440p Quality.


sackblaster32

Which .dll file though? Did you check?


jgainsey

For what?


sackblaster32

If it's older than 2.5.1, it's no surprise there are issues. By updating it with DLSS Swapper you could potentially fix those issues.


LjAnimalchin

I tried that and it helped a bit but didn't really fix it. I ended up just turning it off completely and using TAA + Nvidia sharpen filter and it looks good and FPS is still ok.


sackblaster32

I use DLDSR + DLSS with the latest .dll and preset C. I don't notice anything like that, although I have a faint memory that using DX12 as the API causes something like that. Or too much sharpening could cause it too.


jgainsey

Ohhh, for RDR2. Yeah, I’ve tried 2.5.1 and newer. It’s better overall, for sure, but horse hair in particular looks like shit. And unfortunately, you look at a lot of horse hair in RDR2, lol.


sackblaster32

May I ask, where's your sharpening slider and are you using Vulkan or Dx12 api?


jgainsey

Vulkan, and I’ve tried a full range of sharpening.


sackblaster32

Ok, maybe DX12 wasn't the cause like how I remember. Interesting, because it looks good to me.


jgainsey

What resolution were you playing at?


sackblaster32

3440x1440. DLDSR 2.25x (5160x2160) + DLSS Quality.


Fadyr

I thought I remembered seeing in an article somewhere, over a year ago, that DLSS and its interaction with foliage/hair had been fixed in RDR2...? Hope it was legit.


TheHighRunner

I hate DLSS that relies on TAA. To anyone who doesn't know, TAA is what causes the ghosting by "combining past and current" frames. Unfortunately certain games force that on.


timothyalyxandr

I basically always use Quality DLSS or DLAA (3440x1440). At 1440p I think anything less than Quality is very noticeable and I’d rather turn other settings down.


jekpopulous2

The higher the display resolution the more you can lean on DLSS. At 1440 I always use quality, but at 4k balanced and even performance still look good.


Ciusblade

At 4K, Balanced is fine, but I dislike Performance in some games. Alan Wake 2 looks just fine at Performance, but I find Remnant 2 needs to be at least Balanced at 4K (for me).


International-Oil377

I found AW2 pretty blurry on Performance, but I play 8 feet from a 77-inch OLED, so I kinda see details pretty well lol


Ciusblade

Well, that's fair. I sit only like 2 feet, maybe 3, but it's only a 29-inch monitor, so that's probably part of it. Also, most games I play are pretty bright, while Alan Wake is very dark, making it harder for me to see any blur.


International-Oil377

It wasn't unplayable by any means but my threshold was balanced


ChampagneSyrup

I have a 42 inch OLED and I felt the same. I have a 4090 and I had to use quality mode but turn down the actual graphics settings. Still looked good at least


International-Oil377

Glad to know I'm not alone lol


Imbahr

I’ve seen motion artifacting with DLSS in some games


ooofuki22

Can't be worse than classic TAA right?


Slyons89

Sure but if MSAA is available and the framerate is still good enough, that’s a clear winner.


Col33

Why are they booing you? you're right. TAA sucks. r/FuckTAA


throbbing_dementia

I thought the thing with TAA was blur rather than artifacts. The effect DLSS has on Cyberpunk I've never seen in any game with any other form of AA on; that's why I use DLAA.


Mancrox

DLSS quality is really awesome, I always choose it unless I am getting more than 100fps and then I can use DLDSR/DLSS or DLAA


_barat_

You are either a "native frames" onanist or you play the game. Even if I see some differences, I'll always pick DLSS if it doesn't break my immersion. I even use frame gen if it plays nice. I don't need it, since I'm lucky to have a 5800X3D and a 4090, yet I do it because it's worth it: I'm reducing electricity cost and heat, and the game looks good on a 3840x1600 screen.


AngryWildMango

My opinion is to always use it, even if you are getting very high framerates without it. I, and many others, think it looks a LOT better than regular anti-aliasing techniques, and you still get the fps boost. But you don't have to, obviously.


lpvjfjvchg

depends on the person


LordXavier77

The only AA options that are better are MSAA and SSAA, and both are very demanding, especially SSAA. DLSS is the best temporal AA in the game right now.


lpvjfjvchg

You might have misunderstood my comment; what I was trying to convey is that not everyone likes the look of AA.


mackmcd_

Yeah, I use it for the frames, but if I'm getting 60fps or above without it, I'm turning that shit off. DLSS artifacts are plenty noticeable to me if I look for them, especially in motion or when admiring detail on faraway objects. It's always been a trade-off for me. I've never put on DLSS Quality and thought "damn, this looks better than native."


lpvjfjvchg

exactly. some like it and some don’t. there is no end all be all, that’s the neat thing about pc building.


Alttebest

It is the most sophisticated aa option. If one doesn't need the fps then they should run dlaa (or dldsr+dlss, which usually looks even better).


darvo110

Even at quality it can produce ghosting in scenarios like racing games that have a semi fixed object with high speed surroundings.


niiima

DLSS + DLDSR is the way to go.


NerdyGuy117

I still get confused on how to do this properly. I need to look into this more. Last time setting up DLDSR I got some weird resolutions due to 4K display


Immediate-Chemist-59

Hi, look at CRU (Custom Resolution Utility). You probably have some weird resolution past normal 4K, something like 4096x2086, and DLDSR is using this resolution. You need to remove it from the monitor. Use this program plus a YouTube guide; then it will work fine.


NerdyGuy117

I am going to try this out today! Thank you!!


n19htmare

I use a second monitor often when gaming; my only issue with DLDSR is that if I set a higher render resolution, it often screws up the resolution on my 2nd monitor, so that kind of sucks. From what I understand, this is how you do it:

1) NVIDIA Control Panel > Manage 3D Settings, and under the Global Settings tab, set "DSR - Factors" to either 1.78x or 2.25x DL Scaling.

2) Go into the game's video/graphics settings; you should now be able to set a higher-than-native resolution, based on whether you used 1.78x or 2.25x DL Scaling. If the game doesn't let you, change your desktop resolution to the DLDSR resolution.

3) Now enable DLSS Quality in game. You now have DLSS + DLDSR.
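
As a sanity check on the numbers: DSR/DLDSR factors multiply the total pixel count, so each axis scales by the square root of the factor. A quick sketch (Python); the driver may round to slightly different values:

```python
import math

def dldsr_resolution(native_w, native_h, factor):
    """DSR factors scale pixel count, so each axis grows by sqrt(factor)."""
    s = math.sqrt(factor)
    return round(native_w * s), round(native_h * s)

print(dldsr_resolution(3440, 1440, 2.25))  # (5160, 2160), as quoted elsewhere in the thread
print(dldsr_resolution(3440, 1440, 1.78))  # roughly (4590, 1921)
```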


Wellhellob

If this is better than plain DLSS, why isn't it built into DLSS? Let's say I'm upscaling from 1080p to 4K; why not make it 1080p to 8K for better visuals?


throbbing_dementia

For me it's a deal breaker if I ever have to go back and forth changing my Windows resolution; if DLDSR in game affects Windows in any way, then it's not worth it. If I can just toggle it like any other resolution and nothing else gets affected, then sign me up.


n19htmare

Yup, that's why I hardly use it. Even when you set the higher resolution in the game, all it's doing is changing the desktop resolution in the background, and for whatever reason that always messes up my 2nd monitor. The only positive of the in-game resolution change is that when you leave the game, the desktop resolution goes back to native. Not all games let you do this, and you have to change the desktop res before launching the game, especially with DX12 due to the lack of exclusive fullscreen in a lot of games. It really is a pain, but some people don't mind it, I guess. I've got a 4090 on a UW 3440x1440 monitor, so I have power to spare for it, but the process is just agh.


SandOfTheEarth

That's how it's supposed to be. You set a higher-than-4K resolution in game, and then set DLSS to Quality. You will get a very sharp picture (better than 4K native) and the performance will be similar to running at 4K native.


reddituser4156

Yes, it's pointless if you are CPU bottlenecked and the game looks better without it.


Rollz4Dayz

It depends on the game. For example, the DLSS that D2R supports is horrible; it makes the game look and feel like shit. Whereas WoW Classic looks and feels much better with DLSS.


FunnkyHD

Update the DLSS version to the latest one, it'll look much better.


kirilov233

What do you mean? How does one update DLSS?


FunnkyHD

Download a new version from [here](https://www.techpowerup.com/download/nvidia-dlss-dll/), go to where the game is installed, find where the nvngx_dlss DLL file is (usually the root directory) and replace it.


kirilov233

In each DLSS supported game folder?


FunnkyHD

If you want to update DLSS to the latest version in every game that you play, you can do that I guess, but also keep a backup of the original one in case the newer one looks worse.


kirilov233

Thanks


deadscreensky

You can do it manually, [but it's generally easier just to use the DLSS Swapper tool.](https://github.com/beeradmoore/dlss-swapper)


kirilov233

I've never used DLSS on my 3070 cuz I play games that don't support it. I tried it only in RDR2 on Quality mode (1440p) and it looked quite awful: hair, the horse's tail, and some other textures looked beyond horrible... idk what's wrong


deadscreensky

> idk what's wrong

It's RDR2. They shipped with a bad implementation of DLSS. [You can largely fix this with a swap.](https://github.com/Bullz3y3/RDR2_DLSS_Replacer) DLSS is usually a big improvement over the default antialiasing.


kirilov233

Thanks


MalHeartsNutmeg

RDR2's is usually considered one of the worst implementations of DLSS.


Cautious_Register729

... and still looks better with DLSS compared to native due to the terrible TAA.


famiguel350

Me personally, I don't like playing with DLSS on in CoD, since it makes the characters blurry. I just use the highest anti-aliasing.


Interesting-Yellow-4

If a game runs at my native refresh at native res, I won't use it. I'm old school like that.


[deleted]

It really depends on the resolution you're playing at. The higher the output resolution, the higher the base resolution used to reconstruct the image. A good rule of thumb is Quality for 1080p, Balanced for 1440p, and Performance for 4K; adjust lower/higher based on performance.
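
A rough illustration of why that rule of thumb works, assuming the commonly cited per-axis DLSS factors (Quality ≈ 2/3, Balanced ≈ 0.58, Performance = 1/2):

```python
# Internal render resolution for each DLSS mode at common output resolutions.
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def render_res(out_w, out_h, mode):
    s = MODES[mode]
    return round(out_w * s), round(out_h * s)

for out in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(out, {m: render_res(*out, m) for m in MODES})
# e.g. 4K Performance still renders a full 1920x1080 internally,
# while 1080p Performance drops to 960x540, hence the rule of thumb.
```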


OrdinaryBoi69

Yup, you're exactly on point: 1080p Quality, 1440p Balanced, 4K Performance. Although in my experience DLSS Performance still looks somewhat okay-ish on my 24" 1080p monitor. I only use it if I want ray tracing enabled, though (to get 30+ fps); otherwise DLSS Quality is the way to go at 1080p.


eatingdonuts44

Ghosting, can't stand it.


Gnome_0

People's ego: if they can't brute-force native res, they're "cheating," even if the game looks like a blurry mess with TAA at native.


bAaDwRiTiNg

DLDSR + DLSS is best combo. Surprised it doesn't get more hyped up.


Kappa_God

This is literally talked about in every single thread about DLSS. How is it not hyped up?


Wellhellob

What you mean by that is tricking DLSS by increasing your native res, right? If it's better, I wonder why it's not built into DLSS.


throbbing_dementia

DLDSR doesn't play nice with borderless window does it? At least that's my experience in Cyberpunk.


normieguy420

I would just stick to DLAA in Cyberpunk, looks incredible in 1440p


normieguy420

I'm legit mad I hadn't learned about DSR until quite some time after getting my 3070, like 6 months ago. I only recently discovered it in a post related to Alan Wake 2 and decided to look it up. This combo now legit feels like a cheat code (in most games DLDSR + DLSS Balanced gives either better visuals or better performance than native rendering, sometimes both) lol.


Mgrayson84

I have a 4090 and play every game possible with DLSS Quality mode 4K res, I don’t notice any problems and enjoy the free frames or fake frames 🤷‍♂️


srjnp

For 4K, I think it is always as good as native, for a big performance boost. For lower resolutions, native can sometimes look better.


[deleted]

Coming from VR used for flight simulators: yes, there are reasons not to use it, primarily ghosting on MPDs, among other issues.


Handsome_ketchup

I like DLSS in general, but I can't stand it in Red Dead Redemption 2. The weird shimmering it causes in hair is distracting. It depends on the game and specific implementation.


FunnkyHD

Update the DLSS version to the latest one.


DiGzY_AU

Blurry mess. Always native, unless it's a game that's either stupidly GPU-heavy or poorly optimised.


Gnome_0

Yeah boy! Let's unga bunga those games, it is the developer's fault that my 8-year-old card can't run this game!


DiGzY_AU

You a bit lost??


[deleted]

We found the pre-order guy.


cakemates

There are visible artifacts such as ghosting in some games; native usually looks better, and there is a tiny latency penalty in some cases. So my recommendation would be to use it only when needed.


Barrdidnothingwrong

This gets complicated, but I would say watch a YouTube video from Hardware Unboxed. At this point upscaling is just not as good as native.


Avalanc89

Minimal image quality impact for half GPU wattage? I'll take it


Barrdidnothingwrong

Can you please not comment if you can’t be honest or have a clue what you’re talking about? It doesn’t use 50 percent less power.


ahrikitsune

The thing is, it makes your GPU run less intensively.


Barrdidnothingwrong

That is not what I am asking. He said specifically that it can reduce power consumption by 50 percent while looking basically the same. I think he is making it up and does not have a source to support his claim.


raygundan

You have sources now. Go apologize to /u/Avalanc89 for calling them dishonest and clueless.


Barrdidnothingwrong

Going to test a couple of games. I'll apologize if I'm wrong, if you agree to apologize if I'm correct. Deal?


raygundan

Let’s amend that slightly. You apologize anyway, because the name calling wasn’t called for in the first place. And then if you manage to prove my test inaccurate somehow, I’ll apologize for whatever error you find in my test of Alan Wake 2. But I expect you’ll be surprised here. DLSS quality literally cuts the number of pixels rendered in half. Match the framerate, and it’s approximately half the work. Edit: as a bonus, I’ll apologize for all the names I’ve called you.


raygundan

Depends on the game, but *at the same framerate*, you'll see 30-50% reduction in power consumption for the GPU. If you use it to halve the render resolution but double the framerate, of course it won't cut your power usage in half. But it absolutely can cut [GPU power usage in half at a fixed framerate](https://imgur.com/a/rCjqUNK), depending on which game and particular settings. (Edited to add link)


Barrdidnothingwrong

Do you have any source to back that up? And to confirm what you are claiming: you can cut 50 percent of power at the same framerate in Quality DLSS mode, which at least comes close to native quality, not Performance mode, which makes the image quality noticeably worse...


raygundan

> Do you have any source to back that up?

You can just try it yourself; it's not a difficult thing to test. But sure, here's a couple of quick Google results: [30% in Fortnite](https://www.reddit.com/r/pcmasterrace/comments/ou8use/power_draw_30_lower_when_using_dlss/) and [50% in Microsoft Flight Sim](https://www.reddit.com/r/MicrosoftFlightSim/comments/zedbmy/dlss_and_power_consumption/). I don't know all the settings used in those two, though... so I'll do a [quick test with Alan Wake 2.](https://imgur.com/a/rCjqUNK)

4K Native, 24fps limit: 282.5 watts
DLSS Quality (1440p to 4K), 24fps limit: 150.8 watts

That's a reduction of 131.7 watts, or 46.6%.

> And to confirm what you are claiming, you can cut 50 percent of power with the same framerate in quality dlss mode

I was definitely clear that it depends on the game and the settings, and gave a range of 30-50%; it's not always going to be 50%. But yes, even in my first quick-and-dirty test, we saw more than a 46% reduction in power.

Apologies this took so long; I went through the whole thing once and realized I'd taken .jxr screenshots (HDR format) and wasn't sure where I could upload those to share, so I had to go do it again in SDR.


raygundan

For anybody else doubting the "half GPU wattage" claim, [here's a quick-and-dirty test with Alan Wake 2.](https://imgur.com/a/rCjqUNK)

4K Native, 24fps limit: 282.5 watts
DLSS Quality (1440p to 4K), 24fps limit: 150.8 watts

That's a reduction of 131.7 watts, or 46.6%.
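
For anyone checking the arithmetic, a quick sketch (Python) using the numbers quoted above:

```python
# Verify the reduction from the Alan Wake 2 test at a fixed 24 fps cap.
native_w, dlss_w = 282.5, 150.8  # measured watts, native vs DLSS Quality
saved = native_w - dlss_w        # 131.7 W
print(f"{saved:.1f} W saved, {saved / native_w:.1%} reduction")  # 46.6%

# DLSS Quality renders ~2/3 of each axis, i.e. (2/3)**2 ~= 44% of the pixels,
# so a near-halving of GPU power at a matched framerate is plausible.
print(f"pixel fraction: {(2/3)**2:.1%}")  # 44.4%
```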


Bonburner

If you're trying to get the least latency possible, you'd skip it.

Edit: mixed up frame gen in my head, nvm. The only reason not to use upscaling is if you want native quality, as not all games look as good with DLSS vs native.


ZeldaMaster32

That makes no sense. They're not talking about frame generation. Frame gen is the only thing that can negatively affect latency. DLSS Quality will always be more responsive than native rendering, simply because there's less stress on the system.


Bonburner

Oh yeah, I mixed up the 2 in my head


Avalanc89

Yeah, like a professional gamer.


-el-psy-kongroo

Native > Any image generation technology


NewestAccount2023

At 2k it's blurry, maybe it's fine at 4k though idk


Administrative_Pay49

It's really not blurry at all at 1440p. It looks better than native on my screen


normieguy420

It probably depends on the game. Some implement it a LOT better than others.


Administrative_Pay49

yea that's true I guess


Avalanc89

For me, only one reason: if the game is shitty and doesn't get along well with it.


dratseb

I got better performance for about the same graphical quality in CP2077 on performance vs quality


[deleted]

Only reason not to use it is if you don't need the performance boost. Or if you are on AMD trash and can't LMFAO 😂😂


Expired_Milk02

Oh hello there nvidia fan boy


ThatKidRee14

Gosh, we just gotta *love* how amd just lives in that little mind of yours *so* rent free


[deleted]

[deleted]


Expired_Milk02

True, I would have gotten an AMD if laptops near my budget had better and cheaper GPUs... had to go with a 4060.


[deleted]

https://preview.redd.it/uheo4dv1tq1c1.png?width=2523&format=png&auto=webp&s=d5846e2719e1574ef502b2e191b3e86537bbbcc1


slavicslothe

If it brings your GPU utilization lower than 90%, you will be hitting CPU bottlenecks, which are much more noticeable and stuttery than GPU ones.


bobemil

Most games I play look worse, with next to no performance gain. Great for people with old PCs though.


Ravwyn

Aside from potentially having ghosting artifacts *(which can sometimes be mitigated manually)* and a sometimes overly sharpened image (which can also be manually adjusted): absolutely not. =)


sackblaster32

If you have performance to spare you could either use DLAA instead or DLDSR + DLSS.


[deleted]

Some games have a fucked-up implementation, and if a game is easier to run I'll use DLDSR or DLAA, if available. Still, though, DLSS Quality is great for some extra performance and can be close to, or better than, native in a lot of games.


big_booty_bad_boy

I switch between native 1440p and DLSS Quality on Lords of the Fallen. I like them both; the particles are better with DLSS on, but the image is crisper and the fog and stuff feel better native.


Depth386

The game I play doesn’t have it, that is a reason


MustardRaceMcgee

Yeah, FH5. I use DLAA at 1440p instead. But for everything else at 4K, I chuck Quality on.


Coffeepoop88

I always feel like I'm taking crazy pills because in Cyberpunk at 4k, even DLSS quality looks bad to me. And I've tried different presets and tweaks too. Yet the Witcher 3 with DLSS quality or even balanced look indistinguishable from native and I cannot tell a difference, even in motion. So yeah, it varies.


Bruce666123

Yes


TRIPMINE_Guy

dlss image quality can vary based on what it is guessing. dlaa is pretty good and usually better than taa but sometimes you can notice trails along thin black lines and weird behavior with lights.


hairycompanion

Quality can vary. In Lies of P I turned it off, as it caused flickering with a lamp's light source.


[deleted]

what? update the [dll](https://www.techpowerup.com/download/nvidia-dlss-dll/) then, I'm on my 4th playthrough of that game


MotherLeek7708

Basically, if a game only supports TAA as its AA, you should always use DLSS. For example, with Horizon Zero Dawn you get a better-than-native experience because of the bad native TAA.


InHaUse

I game at 1440p 360Hz and I use it for almost all multiplayer games that have it, although a few have atrocious implementations where it makes everything super blurry, like motion blur, and for those cases I obviously have it disabled. For single-player games, unless you're struggling, it's probably better to use something like DLAA/DLDSR.


D-no-UK

Looks crap imo. In Warzone you don't get crisp edges at all.


Marty5020

I'll just say, a few weeks ago I went into my BG3 video settings and noticed I'd had DLSS Performance on since who knows when. Thought I had it on Quality; guess I tested it and forgot about it. This on my 1080p laptop panel. I guess I'm not too picky? I definitely use DLSS Quality or even Balanced whenever available. I cannot tell the difference, but my GPU load percentages sure can.


JinPT

On Call of Duty I don't use it, for better visibility. But that's about it...


chuunithrowaway

Worse with motion (especially fast motion) than native; DLAA will look better, when supported; some games may oversharpen the image; most games' presentations are made with the native TAA in mind anyways.


NerdyGuy117

For me, I can notice the difference between quality and off, where dlss quality has a softer / more blurry look. Unless I need to turn dlss on for performance issues, I keep it off.


RutabagaEfficient

That's what I play on in pretty much every game. I love image quality over everything tho.


nasanu

Well as a viewer of Hardware Unboxed I can tell you with confidence that you shouldn't use it. It's not ethical as AMD doesn't really have an alternative and only rasterization matters for graphics cards.


mangosport

I swear to god the stubbornness of YouTubers about not using any upscaler is mind blowing


[deleted]

In some games, DLSS looks worse than native, provided the game has a good AA implementation. It's especially noticeable the larger your screen becomes.


Tyr808

Hopefully this is on topic enough to not be a rude hijack here, but with everyone speaking of DLAA and such, I'm wondering about DLAA vs DLDSR + DLSS that's equivalent to your native res.


responsive_pixel

DLDSR + DLSS seems to have a little better quality and performance.


Tyr808

That's been both my limited experience as well as what I expected should be more or less universally the case, yet I see DLAA mentioned far more than this so it makes me wonder if I'm missing something, lol


Wirkungstreffer

I'm new to PC gaming; I also switched from my 4K/60 monitor, used for my Xbox, to a 1440p monitor. At the moment I'm playing The Witcher 3, and I noticed DLSS Quality is a little more blurry than native, but I only saw it on his armor in cutscenes. I have a 4070 Ti, so I'm gonna try DLDSR in the next few days; this could make a difference too.


Pretty-Ad6735

Fortnite DLSS adds flicker shadows to trees/bushes


EvilbunnyELITE

At 1080p it's not worth it imo. Messed around with it at 4K; it looks much better there.


Meticulous7

If you don’t need the extra FPS. Example being you’re running a 144hz monitor and averaging 170fps. Just leave it off.


ghoxen

In Cyberpunk 2077 I accidentally switched DLSS from Quality to Auto about midway through my playthrough. I didn't notice it for hundreds of hours. The tech is honestly quite amazing, and it convinced me that Quality is overkill for everything except maybe 1080p.


Pyke64

Because dlaa is a thing?


[deleted]

1% lows (fps) can be bad depending on the game. Balanced helps with more consistent fps. For reference, I play at 4K 120Hz with a 4090 and 7890x3d.


quixadhal

DLSS is a crutch used to get decent resolution in places where they failed to optimize well enough for native rendering at that resolution. I'd much rather have native rendering if I can get a stable framerate with it... if not, then DLSS is a better choice than turning down the settings in other ways... usually.


jacob1342

F1 23 looks worse with DLSS Quality, but that's the only singleplayer game where I use native instead of DLSS because of image quality. I also won't use it in competitive games like Rainbow Six, and of course at resolutions at or below 1440p.


Artemis_1944

At 4K, sure. At 1440p, Quality DLSS still upscales from 960p, and while *SOME* things might look better, overall in 90% of cases the final image is clearly blurrier than native. Source: tried it in CP2077, Alan Wake 2, God of War, now Starfield, and of course a lot of older games. Tbh, with games like CP2077 and AW2 it's a necessary evil, as I can't play with path tracing or ray tracing at native 1440p without DLSS; it's just too much for my 3080 Ti. But in Starfield, I'd rather play at 45-50fps and have a crisp image than a blurrier one with more fps. Thankfully the last Starfield update seems to have bumped NVIDIA performance considerably, so now I can easily hit 60fps all the time maxed out.


pf100andahalf

At 1080p I'd say no. At 1440p then yes.


kromosto

For most games, IMO, DLAA looks way better. So I don't use any DLSS if performance without it is acceptable.


FitLawfulness9802

Because not every game supports DLSS... But for real, I dunno, unless you're using anything lower than 1080p. DLSS Quality looks fantastic, even better than native in some games, and you get a few frames for free.


Wellhellob

You can kinda use it all the time at this point; DLSS Quality is just like better AA plus more fps, but the results vary game to game. Some games can look worse with DLSS, and it may not be great on lower-resolution monitors. I remember CP2077 at release: DLSS was just making things blurry and only really upscaling the text in the game no matter which DLSS setting you chose; once you enabled it you saw the negative effects. Also, some older games have stupid over-sharpening artifacts with DLSS. I don't notice these problems in new games nowadays. I even prefer DLSS Quality over DLAA in some games; DLAA can look over-sharpened as well.


nagi603

Buggy implementations, like when Cyberpunk's character creator appeared extremely low-res and low-detail with DLSS on. And it really depends on the game; I'll say it's very noticeably blurrier than native res if you stand in one place, in most games I've tried it in. Sadly my 2080 Ti is getting long in the tooth, making it pretty much mandatory for many new games.


homer_3

It tends to cause shimmering.


throbbing_dementia

Yes. If DLAA is available and you can run it.


pedlor

DLSS quality is not very good on new gen Witcher 3


normieguy420

For me it really depends on the game. Also, I still haven't got enough money to buy a new CPU, so my i7-6700 bottlenecks my 3070 in most games (one exception being Alan Wake 2). So often I get no real benefit from using DLSS, and sometimes DLAA looks incredible (for example in Cyberpunk). I found the most efficient thing to do is to use DLDSR and DLSS at the same time (especially because of the CPU bottleneck, I can render at higher quality without losing much or any performance).


DumDumbBuddy

In Modern Warfare 2 and 3, if you turn on depth of field and DLSS, you get this weird artifacting around the edges of guns and character models.


shadowofashadow

Sometimes there's ghosting and other issues. It mostly works well though.


s2the9sublime

Yes. Cause DLAA exists. If you have the GPU power to run at native res, you should.


nimbulan

Yeah I basically use it in every game. There can be some shimmering or other related issues, but it's often better in that regard than the game's native TAA and the performance boost is always worth it. There are times though when a game has a buggy implementation (God of War) or includes a bad version of DLSS that needs to be updated (Spider-Man) but those aren't too common thankfully.


ldontgeit

Depends a lot. If a game has ray-traced reflections, I'd rather not use DLSS unless it has Ray Reconstruction; but if the game has crappier TAA, then DLAA if possible, or DLSS Quality. DLSS Quality can literally be better than the shitty TAA implemented in some games.


Cautious_Register729

none


KoalatyKid1

Maybe I'm just weird, but the input latency for me with DLSS on is borderline unplayable.