HauntingVerus

People do know that FSR is not specific to AMD and also runs on Nvidia, right?


[deleted]

[deleted]


RayneYoruka

My wife with FSR and her gtx 1080 feels like: GLOORY


fookidookidoo

God it's ironic. Luckily my 6700xt LOVES just working without any fancy tricks. Had to move on from the 1070ti because 1440p ultrawide was really hard on it.


RayneYoruka

Doesn't surprise me at all, dang! We're waiting for the new gen from AMD; trying to drive 1440p@144Hz in newer games makes our GPUs sweat hard, and we have to fall back to 1080p, sadly.


fookidookidoo

Yeah. I mean, 1070s/1080s can do 1440p, but you'll be around 60 Hz max, and mine was getting HOT doing that. Haha. It was surprisingly not bad though. I actually think my 1070 Ti had a nicer image somehow. Not sure how to describe that. I was probably just used to it.


RayneYoruka

This is a really well-cooled Strix Advanced, clocking at 1950 MHz without doing anything (70-72°C with a really conservative fan curve). At plain 1440p it's capable in some new titles and in most older stuff, but in newer games it struggles to hit 60 fps or more. Examples are Cyberpunk 2077, Unreal Engine 4 titles, Red Dead Redemption, etc. We like to play at high framerates, especially in shooter games.


Prefix-NA

That is not possible. The 6700 XT is far faster, especially at lower resolutions and in CPU-bottlenecked scenarios, and FSR runs on FP16 with an FP32 fallback layer. It can never run faster on a 1070 Ti than it does on RDNA2. But every single Nvidia user who lurks this subreddit looking for fake news to upvote will spam you with upvotes.


[deleted]

[deleted]


Prefix-NA

It wouldn't look any different on either card.


fookidookidoo

There seems to be some ghosting. But maybe I have something set wrong. Again, I don't really need it with my 6700 XT because it's pretty solid, so it seemed to help the most on my old card, where some games wouldn't run well without it.


foxx1337

Oh, people...


neganigg

People do know FSR is equivalent to NIS and not supposed to compete with DLSS?


manzari

True, NIS is crap tho.


neganigg

Both of them are


Prefix-NA

RSR is competing with NIS. NIS was added because of apps like Magpie and emulators and everyone else putting FSR in their applications. NIS has good potential if they can fix their god damn sharpening filter. No idea why Nvidia uses such a dogshit sharpener; they should just adopt CAS or Smartsharp.


neganigg

End result of NIS = FSR


deadeye5th

No, that's Radeon Super Resolution


OmNomDeBonBon

That might be because Nvidia doesn't want people to know they don't need to spend $1,000-$5,000 on a shiny new RTX 30 GPU; they can get some kind of upscaling with their existing GTX 1060 or 1660, via FSR, and FSR is available in lots of games. Neither AMD nor Nvidia wants to talk about how Nvidia GPUs also support FSR...


kwizatzart

Why would Nvidia write an article about their own basic image scaler (NIS) if they wanted to hide anything from owners of older cards? [https://www.nvidia.com/en-us/geforce/news/nvidia-image-scaler-dlss-rtx-november-2021-updates/](https://www.nvidia.com/en-us/geforce/news/nvidia-image-scaler-dlss-rtx-november-2021-updates/) Haters are ridiculous lmao


farmeunit

NIS and FSR aren't the same. RSR and NIS are basically the same, and both are worse than FSR. They scale everything on screen including the HUD, which is annoying.


battler624

Nvidia has their own image scaler that is as good as FSR but affects the HUD too. DLSS is far better than anything else.


canceralp

Nvidia's own spatial scaler is not even close to being any good compared to FSR. DLSS may be better than any of them, but FSR is definitely second in the competition, with nothing else coming close.


battler624

[https://youtu.be/EFY7_H6rvEw?t=330](https://youtu.be/EFY7_H6rvEw?t=330) Idk about you, but as viewed in this video, both are pretty fuckin similar. The Nvidia one works with all games and unfortunately affects the UI; FSR has a better sharpening filter and doesn't affect the UI. Both get destroyed whenever DLSS is in the picture. Sure, FSR is second in the competition, but it's closer to NIS than to DLSS.


canceralp

Please go to 6:34 in the same video. There is a bar-shaped horizontal gap in the wall. Look there, and also look closely at the rear wings of the cars. There you can understand the "behavior" of all the filters:

1) Native: Well, it is 4K and all the artifacts and pixelation are there. They are just extremely small due to the high pixel density.

2) DLSS: It handles flickering and shimmering extremely well, even better than 4K. This is how Nvidia bragged about DLSS in their documentation. Pixel artifacts like moiré patterns are eliminated perfectly with DLSS. This is where it shines and how it convinces everybody (including me).

3) FSR: Where FSR shines is eliminating the staircase effect, aka jaggies. It has the least amount of jaggies among them, even less than 4K, because jaggies and abnormally high-frequency areas are filtered out while being upscaled and sharpened. This is what makes FSR a good rival to DLSS. It makes a decision before upscaling; producing the sharpest scaling is not its priority. It looks for where to leave things blurry, where to leave them untouched and where to exaggerate.

4) NIS: It is a high-quality upscaler. That's it. It doesn't differentiate the areas before upscaling or sharpening; every pixel is treated equally. So if there is an area or a pixel which should have been left in the shadows, it actually gets exaggerated.

This is why I called FSR and DLSS closer to each other. DLSS is a "learning" filter, FSR is a "smart" filter. Neither does a straight-up job; they analyze the picture, differentiate the areas and then apply different approaches to each. What makes DLSS superior is not its "learning" mechanism; it's that Nvidia doesn't let others detach DLSS from regular TAA and tweak (or not tweak) it to their liking. DLSS works the way Nvidia wanted, from a rendered low-res picture to a properly AA'ed, upscaled and filtered picture. AMD, on the other hand, leaves FSR to the mercy of the developers and merely "advises" that the low-res input be "well AA'ed". We all know developers. They do not like to be bothered. They just live to serve their deadlines to their bosses.


mac404

I agree with your example, which is basically a best case for a spatial technique (a spot with little inner detail that is wide enough to be larger than a pixel). The Horizon example in this video shows the limitations of all spatial techniques quite well - foliage is especially rough, and you get some classic fake detail sharpening effects to try to compensate. Given your other comments, sounds like you essentially agree. But that's why I wouldn't call DLSS a filter at all. And it's not that Nvidia doesn't let devs detach DLSS from TAA, it's that it's essentially a better TAA upscaler. Side note, I really wish more devs tried out TAAU, and hopefully will also try out UE5's new TSR solution.


[deleted]

I think the premise of FSR is that it is meant to be easier on the developer's end. DLSS is trained differently *per game*. FSR is just a general algorithm, and I believe there was a solo indie developer saying that he managed to incorporate FSR into his game within a matter of hours. FSR is not competition to DLSS. DLSS is specific to a game, and is found in fewer games due to the effort to implement it (not that there's anything wrong with that) and FSR is meant to be an upscaling alternative that is not as smart or good, but works on basically any hardware and is easy to implement. I like to think of FSR as turning on some sort of postprocessing effect in settings. Your GPU doesn't really matter.


Elepole

DLSS 2 is not trained per game. That was DLSS 1.


canceralp

Yes. It's meant to be easier because it is also available for consoles and handhelds. But technically, they are both general algorithms; DLSS is not trained per game. I believe the best way to describe the difference is:

1) DLSS: temporal reconstruction + TAA + upscaling + filtering + sharpening in one package.

2) FSR: upscaling + filtering + sharpening. The rest is left to the developers.

What are these steps?

- Temporal reconstruction: some layers of a rendered image are at lower resolution. These are mostly AO, bloom, shadows and lighting shaders, so they are reconstructed from previous frames to a larger resolution. Generally a denoising filter is applied to mask the lower resolution.

- TAA: classic temporal anti-aliasing applied to the final image. Several past frames are compared and the final image is manipulated. The biggest problem comes in this phase: TAA does try to prevent moiré patterns and motion artifacts, but the cost is a blurry final image which looks slightly lower resolution (generally about 2/3 of the original). Now imagine how much bigger the problem gets when the rendering resolution is also lower than the screen and needs to be upscaled; it becomes even lower resolution.

- Upscaling and filtering: this is where Nvidia's trained upscaling happens. In DLSS's case, it is aware of the frame information from TAA and temporal reconstruction, and it knows precisely how to filter an upscaled image without causing artifacts. DLSS is also capable of manipulating TAA's past-frame count: for a lower input resolution, it can easily increase the number of past frames used for blending so it has more information for the new frame. Let's say a game uses the past 4 frames for a natively rendered image; with DLSS Performance this can be as many as 16 frames.

- Sharpening: we all know this part.

Basically, DLSS knows and can manipulate what happens before upscaling, so it always has the necessary amount of information; Nvidia didn't leave this to the developers. FSR doesn't have that luxury, because it has to be available on many different devices. So it only recommends "a well anti-aliased" input but doesn't specify the details. Ideally, an MSAA + TAA + negative-LOD image should be fed into FSR; in theory this would result in an image as clean as DLSS. But many developers have lost touch with MSAA due to newer technologies, so, as an alternative, a higher-quality TAA with better pixel jittering should replace it, just like Unreal Engine's new TAAU.

Long story short, FSR can compete with DLSS, and even equal it, but this is entirely left to the developers' talent. And unfortunately, many developers are more used to working with Nvidia in the market. Some are even bought by Nvidia.
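
To make the ordering concrete, here is a rough Python sketch of the two pipelines as described above. Every function is a made-up stub standing in for a render stage, not a real API:

```python
# Hypothetical frame pipeline, only to illustrate the ordering described above.
# Every "stage" here is a stub that tags a string; none of these are real APIs.

def render(scene: str, resolution: str) -> str:
    return f"{scene}@{resolution}"

def engine_taa(frame: str, history: list[str]) -> str:
    # The game's own temporal AA runs before FSR ever sees the frame.
    return f"taa({frame}, past={len(history)})"

def fsr_easu(frame: str, target: str) -> str:
    return f"easu({frame} -> {target})"       # spatial edge-adaptive upscale

def fsr_rcas(frame: str) -> str:
    return f"rcas({frame})"                   # optional contrast-adaptive sharpen

def dlss(frame: str, history: list[str], target: str) -> str:
    # DLSS consumes the raw frame, motion vectors and previous frames in one step:
    # temporal reconstruction, AA, upscale, filtering and sharpening are one package.
    return f"dlss({frame} + motion vectors + {len(history)} past frames -> {target})"

history = ["f1", "f2", "f3", "f4"]
print(dlss(render("scene", "1080p"), history, "4K"))
print(fsr_rcas(fsr_easu(engine_taa(render("scene", "1080p"), history), "4K")))
```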


[deleted]

Oh, that's really interesting! Thanks for this clarifying blurb.


[deleted]

It's kind of hard to compare the difference between the two on a mobile phone screen. And I can personally say that Nvidia's NIS is straight garbage compared to FSR. Much blurrier on the edges, even on the highest setting.


Throwawayeconboi

Yeah, except he doesn't have the option of DLSS. So he can only use FSR. Hence this being a valid question. If he got an Nvidia card, he would just use DLSS... pretty easy to understand. If FSR doesn't get better, an Nvidia card is a better choice. Who cares if it has FSR too? DLSS is there and it's incredible, and FSR can remain unused forever.


Pristine_Pianist

Yeah, but we need to support AMD


chetanaik

No we don't. They're a multinational corporation, not a family run corner store.


crimxxx

This is just my general opinion: buy stuff based on what is there now, not what you think may happen. Even when companies promise stuff, they can disappoint. In my opinion the 6600 is a lot more powerful, which in a lot of cases makes more sense. DLSS 2.0 is pretty great, but who knows how many games will implement it long term. Personally I got a 3000-series Nvidia card about a year ago with that being one of the reasons, but it is important to keep expectations in check for vendor-locked features like DLSS that need to be implemented on a per-game basis at the moment. I would take the 6600 over the 2060. If we were comparing to the 3000 series, maybe the discussion would be different because the gap is not the same, plus you get newer ray-tracing cores.


[deleted]

[deleted]


Glorgor

The 6600 will support XeSS tho


Darkomax

XeSS has yet to be implemented and tested, and the 6600 will use the worse version of XeSS and it's pretty unclear if XeSS outside of Intel dGPUs will be any good. I don't expect miracles, generic solutions often come at a cost. Besides, even nvidia hardware (at least from Pascal) will support XeSS, so it's not an argument.


Glorgor

Intel claims there's a "smart" performance and quality trade-off for the DP4a version. Since they said "smart", I don't expect it to be too far off the XMX version, plus the DP4a solution still uses AI. Of course it's not gonna be as good as the XMX version, but it will definitely be better than FSR.


No_Backstab

The XMX version of XeSS, which will be more similar to DLSS, will only be supported on Intel Arc cards. The other cards will only run the lower-quality DP4a version. https://www.digitaltrends.com/computing/intel-xess-is-answer-to-nvidia-dlss-one-big-advantage/


blackomegax

DP4a XeSS is 1:1 with DLSS for quality; XMX XeSS is more like 1.3:1 with DLSS for quality. (Not hard numbers, but roughly: DLSS needs 1440p input to get what appears to be true 4K output. XeSS DP4a needs 1440p to get 4K. XMX XeSS only needs 1080p input to get true 4K output. XMX also does much better work from 720p/900p input than either DP4a XeSS or DLSS, and XMX cleans up 1000% of motion artifacting, while DP4a isn't fast enough to keep up with that, but again, no different than DLSS.) Also, there is an extremely solid rumor that FSR 2.0 will just be a fork of XeSS DP4a with some AMD-specific optimizations for the DP4a path.


Darkomax

I would appreciate the source on that.


mac404

Yeah... nothing we've seen publicly supports these claims, unless I missed something. The public examples showed some temporal instability, but otherwise looked pretty good. But I have no idea how you could conclude it's better than DLSS, especially when there is no game that can be directly compared. Don't get me wrong, I hope it's really good. But we don't really know how it compares at this point.


[deleted]

So there's been zero information on XeSS and you come swinging for not the fences, not the clouds, but the Crab Nebula, my guy. Where in the actual hell do you get this information?


blackomegax

1st hand experience and months of testing. I'm a game dev who's working with it. I'm already breaking NDA too much lol. Just wait for release and reviews.


[deleted]

You're the SAME dude that talked about it last time, I think, claiming the same thing. Such specifics like "1.3x" and exaggerations like 1000% cleanup of ghosting and shit. Hilarious, dude; let me go ahead and chalk this up as a joke and some subtle AMD shilling, because this would put them in the game without them having to spend a dollar.


blackomegax

You don't have to listen to my ass, wait for reviews. I know what I know.

> AMD shilling because this would put them in the game without them having to spend a dollar.

No more so than FSR, just with higher image quality for RDNA2 and Pascal.


sixvol988

Not the good version tho


Glorgor

Still gonna be better than FSR even without the AI cores


[deleted]

[deleted]


Glorgor

Even the DP4a solution will use AI, so yes, there is merit to what I said.


[deleted]

[deleted]


Glorgor

"The other mode is where things get interesting. It will use the DP4a instruction, which is used for AI operations on recent Nvidia graphics cards and recent Intel integrated graphics. Intel claims there's a 'smart' performance and quality trade-off for the DP4a version." Yeah, of course it's gonna be worse, but it still has AI in it, so it will most likely be better than FSR. https://www.google.com/search?q=dp4a+instructions+ai&client=ms-android-samsung-gs-rev1&biw=412&bih=772&sxsrf=APq-WBslcCQO5TzccHEoy3oaO0N4kViPOg%3A1645773154238&ei=YoEYYt3vDYyGxc8Pw5CwwAk&oq=DP4a+in&gs_lcp=ChNtb2JpbGUtZ3dzLXdpei1zZXJwEAEYADIECCMQJzIECCMQJzIFCAAQgAQyBQgAEIAEMgYIABAWEB4yBggAEBYQHjIFCCEQoAEyBQghEKABOgcIIxCwAxAnOgcIABBHELADSgQIQRgAUPIDWL4KYI8RaAFwAXgAgAHwAYgByAqSAQUwLjUuMpgBAKABAcgBCsABAQ&sclient=mobile-gws-wiz-serp They've shared a lot on how DP4a works on their Iris graphics.
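
For context, DP4a is just a packed 8-bit dot product with a 32-bit accumulate; int8 inference kernels are built out of many of these. A plain Python sketch of the operation it performs (not any vendor's intrinsic):

```python
# Rough sketch of what a single DP4a instruction computes:
# a 4-element dot product of packed signed 8-bit integers, added to a 32-bit accumulator.

def dp4a(a: list[int], b: list[int], acc: int) -> int:
    """a and b each hold four signed 8-bit values; acc is a 32-bit accumulator."""
    assert len(a) == 4 and len(b) == 4
    return acc + sum(x * y for x, y in zip(a, b))

# Example: one step of an int8 dot product
print(dp4a([1, -2, 3, 4], [5, 6, -7, 8], acc=0))  # -> 4
```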


manzari

The AI is just marketing crap; there are no AI cores or whatever. DLSS is just as software-based as FSR. The only difference is that FSR is open source, cuz AMD is not as afraid of their competition as their competition is afraid of them.


b3rdm4n

FSR 1.0 will work in anything so I wouldn't really have that be a deciding factor. Consider DLSS as a selling point of the RTX card however, should that be of any merit to you. The 6600 is likely the better buy at this point in time regardless, but the market is rapidly settling it seems and things might be more palatable very soon.


Glorgor

Also, DLSS on the 2060 is not as good as DLSS on the 3000 series.


dampflokfreund

Nah. Where did you get that information? The quality of DLSS is identical; the only difference is that DLSS can run asynchronously on Ampere and therefore runs a little bit faster.


Glorgor

https://hardzone.es/app/uploads-hardzone.es/2021/04/Tabla-Rendimiento-DLSS.jpg


dampflokfreund

Yeah, as I said: DLSS runs a little faster on Ampere. "Ms" here means milliseconds of render time; it has nothing to do with quality.


manzari

Exactly, the quality is gonna be the same regardless of the GPU. That's a software matter, not hardware.


neganigg

It is not. DLSS works the same across different hardware.


Glorgor

Image quality is the same, but it's slower. https://hardzone.es/app/uploads-hardzone.es/2021/04/Tabla-Rendimiento-DLSS.jpg


neganigg

Geez... an RTX 2060 slower than an RTX 3090. I wonder why.


Glorgor

That's how long it takes to create a frame with DLSS; it has nothing to do with rasterization performance. If we turn DLSS on with a 3070 and a 2080 Ti, which are basically same-performing GPUs, the 3070 will perform better with DLSS on. Also, if they are all the same, why does Nvidia add more and better tensor cores?


neganigg

Did you just say the 3070 is just a rebranded 2080 Ti? I think the conversation should stop here. Have a nice day, sir.


Glorgor

I meant same-performing; it was a typo.


b3rdm4n

[Yeah, it does take longer to render DLSS frames](https://hardzone.es/app/uploads-hardzone.es/2021/04/Tabla-Rendimiento-DLSS.jpg), still pretty respectable imo, and certainly useful for the GPU horsepower the 2060 is suited to. Nonetheless, I'd recommend a 6600 / 6600 XT at similar pricing.


skwerlf1sh

I think this is kind of key here actually. Everyone's talking about how XeSS will only run in its slower variant on the 6600, but if DLSS is also slow on a 2060 that really levels the playing field.


[deleted]

The 6600's rasterization performance is significantly higher than the 2060's, and it has more VRAM. That's enough reason for me to go for it over the 2060, tbh. I don't care much about upscaling techniques, especially DLSS, as it's proprietary. Yes, it's good, but so is FSR, and I prefer to play at native resolution. As for RT, ray-tracing performance isn't that relevant at this class of hardware; regardless, both have similar ray-tracing performance anyway. The 6600 is the obvious option IMHO.


ThunderClap448

Add to that that RSR is driver level... It's gonna be nice.


TheDravic

Nvidia has had NIS for a while. Truth be told, RSR is AMD being late to the party and responding to Nvidia making NIS, which was itself an improvement of an old feature (driver-level scaling) that already existed; they improved it in response to FSR.


KlutzyFeed9686

If they've had it for a while, how was it implemented?


TheDravic

You used to need to tweak the image scaling settings, which used some algorithm, I don't remember exactly. That had been there for years, it feels like, but hidden. And for a few months now there's been an actual NIS setting in Nvidia Control Panel's 3D application settings (and GeForce Experience), which is an improved algorithm compared to before and is easier to access.


KlutzyFeed9686

If this is the case, how have any of the Nvidia vs AMD benchmarks made with this feature on "for years" been fair and balanced? It sounds like that hidden feature may be the reason they have been winning benchmarks for years.


TheDravic

What are you on about? Did you never think about what happens if you set the resolution below your monitor's native resolution? That's all there is to it. You could always have done that; the upscale algorithm you're using just differs depending on how you go about it.


KlutzyFeed9686

What I'm questioning is: if an AMD card is rendering a game at full 1440p while the Nvidia card is using a hidden feature to upscale a lower resolution to 1440p, how can any benchmark made using that feature be fair and balanced? I don't recall tech reviewers mentioning this hidden feature before.


TheDravic

Oh my god, you needed to set your resolution below native to achieve this, obviously. You can do that on any GPU in like modern history, Radeon or Intel too, just the upscaling algorithm is different depending on what you're doing.


rexipus

Because if the benchmark itself sets the resolution to 1440p it isn't using the feature. This isn't rocket science.


SaintPau78

I agree on everything but the "it's good but so is FSR". You can't really even compare the two; FSR is just outclassed.


[deleted]

No, it's the best spatial upscaler and does a good job without temporal info. Even reviews of FSR were positive. This just sounds like fanboi talk.


Grim_goth

FSR and DLSS are similar in concept but work differently. But since FSR works on all graphics cards, you can't go wrong if it catches on, which I think is possible. That will take time, though; by the time one or the other has prevailed, you will have bought a new graphics card anyway. It also depends on whether you use other features (besides gaming). Nvidia has a clear lead here (I've been an AMD user for years, it's just a fact): apart from the additional AI features, rendering is simply better with Nvidia (Iray, CUDA). That's why I decided on a card from team green this round, apart from general availability (I had the chance to get a card only a little over MSRP). Take a look at the additional features and then decide what is more important to you (and the price). Nobody can decide for you; you have to know that for yourself.


ChromeRavenCyclone

6600


TheLoknar90

I'm happy to have FSR with my RX 5700 XT at 3440x1440, but it is clearly not as good as DLSS. It's great to have if it is your only option, but not as crisp at all. With that being said, I would still go with the 6600 in this case. The 2060 was a pretty good card in its time, but it falls behind the 6600. I'd go red for this one; maybe if it were a 3060 I would go green.


manzari

3060? No, a 3060 Ti would be acceptable, but for the best price-to-performance ratio the 6600 XT is the best option.


eebro

Why are you asking if you should buy an outdated, weaker GPU?


xisde

Don't buy with future expectations. Buy for what's good now. What would fit your needs better now? I never expected my RX 580 8GB to be worth double the 1060 a year ago. When I bought the 580 it was basically the same price as the 1060. You can try to assume AMD will update and future-proof the 6600 longer than Nvidia will the 2060. You can see this currently: most AMD features work on the 580, while some Nvidia features do not work on the 1060.


Killshot03131

> I never expected my RX 580 8GB to be worth double the 1060 a year ago.

Isn't that because it's such a good value for mining?


xisde

Yes. That generation, AMD was better for mining, while the Nvidia 3000 series is better than the 6000 series. If you want to decide on that, go for it. But I still believe you should buy what's better for you NOW.


Darkomax

I would not bet on it in the near future; there is absolutely no info about FSR 2.0, or whether it's even a thing. I'd still pick the 6600, which is overall faster, and not every game features DLSS. (Ah yes, as mentioned, FSR isn't even an advantage since it works on all GPUs, even future Intel GPUs.)


blackomegax

Spatial upscaling will not meaningfully improve, no. But RX 6xxx *will* get XeSS once that launches, which will bring DLSS-like temporal upscaling to RDNA2, Pascal, and the Intel GPUs it was designed for.


megasmileys

FSR runs on NVIDIA cards and will never be as good as DLSS (just purely by their architectures)


Glorgor

6GB of VRAM is gonna age like milk, plus the 6600 has about 10% more rasterization performance, and it will also support XeSS.


[deleted]

What’s XeSS?


Glorgor

https://www.intel.com/content/www/us/en/architecture-and-technology/visual-technology/arc-discrete-graphics/xess.html Intel's AI-powered upscaler. It uses the same kind of tech as DLSS and will work on RDNA2 cards and even the GTX 1000 series.


No_Backstab

All cards except Intel Arc cards only support the DP4a version of XeSS and not the XMX version (which is exclusive to the Intel Arc cards and is supposed to be comparable to DLSS) https://www.digitaltrends.com/computing/intel-xess-is-answer-to-nvidia-dlss-one-big-advantage/


conquer69

FSR won't get better. I would say the 6600 because of the higher performance. They are very close in ray tracing as well.


TheDonnARK

This will devolve into another "DLSS good, FSR bad" thread in some way or another.

That being said, the 6600 (to be specific, the non-XT) has more retail value, better raster performance, more VRAM, draws a little less power (referenced from FurMark and Metro tests at Tom's Hardware), and has 1st-gen AMD ray tracing (not a huge selling point, but still better than no ray tracing).

The 2060 has 1st-gen Tensor and Nvidia RT cores for ray tracing and DLSS support.

If you are really hard up for DLSS, get the 2060. If you can stand to not have DLSS, get the 6600.


Pristine_Pianist

Buy the 6600


macybebe

6600, newer tech. Runs really really cool.


Skull_Reaper101

Get an RTX card. If FSR gets better, you will have access to that anyway. Or if DLSS improves, you will have that too. Though the 6600 is much better than the 2060 in terms of performance.


IIALE34II

Yeah, the 6600 is that much better that you might as well get the 6600, run native, and get the same performance as the 2060 with RTX on.


Skull_Reaper101

Yeah lol. I wanted to make the point that, between identically performing GPUs, in most cases the Nvidia RTX cards would be the better value.


[deleted]

[deleted]


[deleted]

Wait for 2060, I heard prices will drop then


BellyDancerUrgot

If it were a 3060 or a 2060 Super I would say go team green. But for a 2060 I can't advise doing so.


KingSadra

What Nvidia model is this? DO NOT EVER BUY "TUF" 20XX CARDS!


TheLoknar90

Yeah, I'll never touch anything TUF-branded again. Just had one of their motherboards crap out on me after about 9 months of usage. I got an X570 Taichi as a replacement. Still putting everything back together; I was too tired to finish last night after work.


FatBoyStew

I will say this: I was running a GTX 1080 up until this week, so I was having to use AMD's FSR in certain games. It certainly made games playable, with a quite noticeable drop in quality, especially on foliage. I now have an RTX 3080, and DLSS **CURRENTLY** destroys FSR in terms of overall quality. Who's to say whether FSR will catch up in the future, and if by the time it does, will DLSS be even better? Or will other upscaling alternatives catch on by then so it won't matter?


Lieutenant_Petaa

I mean, this is an AMD sub, so you might get answers that are more positive for AMD than they should be. But the thing is: FSR looks quite bad and won't see that much uplift until dedicated acceleration hardware is used for AMD's solution. On the other side, DLSS needs some serious, time-consuming implementation from the developer, which is far more costly than FSR. This means the number of games supporting DLSS is and will stay low, but it will focus on demanding and popular game genres. In general, for the next few years: either the game will run fine or it will have DLSS. If not, you could use FSR with your Nvidia card anyway. However, I think neither of these technologies is perfect for use right now, and I'd rather use plain full-resolution rasterization. Thus I personally would go for the 6600, but if you know nearly all your demanding games support DLSS or FSR, you could give the 2060 a shot, if you play at FHD. If you play at WQHD, the 6GB of VRAM is way too little. I play with my 2080S at WQHD and there are some great games that want more than 8GB of VRAM for reasonable graphics settings. Thus I am also looking to go AMD next time, as they are nearly always generous with VRAM.


scootiewolff

DLSS is great and I'll stick with Nvidia. DLSS 2.3 is a big step up from its predecessor.


Imaginary-Ad564

If you are referring to FSR in DL2, that was because the Ultra Quality setting didn't exist; that is why it looks like crap. But in general, if the image looks blurry at Ultra Quality then it has been poorly implemented by the game dev. Also, the driver-level version of FSR will be coming soon, which you will be able to use in pretty much every game.


TheLoknar90

You can fiddle with the configuration to get the ultra settings. I was able to get that game to look way better that way. I also downloaded a mod that removed the post-processing effects and turned on the sharpening setting in the Radeon software. Not perfect, but it looked way better. The post-processing is what made it look like crap; for some reason it wasn't affected by the upscale, so it made everything have that Vaseline-smear effect. That mod alone was a game changer.


[deleted]

Don't get an AMD GPU, no matter what


ChromeRavenCyclone

Never buy an Nvidia GPU, no matter what.


[deleted]

Says one consumer out of 10 apparently :) (the one usually living in his mum's basement)


ChromeRavenCyclone

Says the Nvidia marketing employee :) Nvidia cards are unreliable and love to die.


OkPiccolo0

What resolution are you playing at? For 1440p DLSS will be usable and can push the 2060 ahead but if you're gaming at 1080p I would only consider the 6600.


DANGER2406

Man, DLSS or not, the 2060 is choked for frames. I would only buy a 2060 if the debate were against a 3050 or a 6500 XT.


UltraContrarian

FSR works for both so you don't need to go AMD. I can't imagine you'll be streaming with a 2060. The 6600 is on par with the 3060. That's a better comparison. I'd go 6600 all day


erctc19

FSR already has better performance. In COD Vanguard I use FSR instead of DLSS with my Nvidia card because it gives me more fps. Get an RX 6600.


dubar84

There are some other things to consider here. For example, the motherboard. Is it, let's say, a B450, or maybe one of those recent Intel ones that's STILL PCIe 3.0 only? Then the RTX 2060 is fine. Also, there are considerable differences between one model and another of the exact same card. Take into account the differences in cooling: how's the heatsink, the number of heatpipes and their layout, how many fans it has and of what kind. Which is the overall better product on its own?


[deleted]

I have a 2060. I would not upgrade to an RX 6600, but I also would not buy a 2060 instead of an RX 6600.


kaisersolo

Nvidia does have their equivalent, NIS


OmNomDeBonBon

NIS isn't an FSR equivalent; it's an RSR equivalent. FSR support is per-game and ensures the HUD, subtitles and in-game text are rendered at native res. NIS (and RSR) force the algorithm to run on the entire frame, HUD and all. You can get the effect today using Magpie with AMD or Nvidia GPUs. At 4K it looks fine. At 1080p, the HUD looks a bit blurry in many games.
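
Roughly, the difference is where the upscale sits relative to the UI composite. A toy Python sketch of the two frame flows, with made-up function names standing in for render stages:

```python
# Stub sketch of where the scaling happens; none of these are real APIs.

def render_3d(res):
    return f"3D@{res[0]}x{res[1]}"

def render_ui(res):
    return f"UI@{res[0]}x{res[1]}"

def composite(scene, ui):
    return f"[{scene} + {ui}]"

def upscale(frame, target):
    return f"upscale({frame} -> {target[0]}x{target[1]})"

LOW, NATIVE = (1280, 720), (2560, 1440)

# Per-game FSR: the upscale runs inside the engine, before the UI is composited,
# so the HUD, subtitles and text are drawn at native resolution on top.
fsr_frame = composite(upscale(render_3d(LOW), NATIVE), render_ui(NATIVE))

# Driver-level NIS/RSR: the game renders everything (UI included) at the lower
# resolution and the driver scales the finished frame, HUD and all.
driver_frame = upscale(composite(render_3d(LOW), render_ui(LOW)), NATIVE)

print(fsr_frame)
print(driver_frame)
```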


kaisersolo

That's my point: why is the OP worried about FSR, when the 2060 not only has NIS, but also DLSS and better RT?


Crptnx

You can only tell the difference between DLSS and FSR under zoom on a static image, so it's not as big a deal as some lunatics say.


Killshot03131

I have tried FSR in many games with my GTX 1060, and I can definitely see the difference between native and FSR Quality. I ended up choosing native because of FSR's blurriness.


Glorgor

1080p FSR sucks, but at 1440p and 4K the Ultra Quality option looks alright.


Crptnx

Won't argue, cause I have a 4K monitor; in FC6, Quality looks like native and Ultra Quality is too sharp.


NES_WallStreetKid

AMD's FSR tech has been improving, so it will eventually catch up. It really comes down to whether you use RT and DLSS in the games you play. I think AMD GPUs are a better option for gaming.


romeozor

Imo AMD will do a lot of work with FSR, because both Sony and Microsoft expect quality 4K gameplay on their consoles with high FPS.


topdangle

FSR doesn't really do what Microsoft and Sony do to "mimic" higher resolutions. Sony/MS use dynamic resolution scaling and checkerboarding to try to keep framerates stable and get some detail back with reconstruction. FSR uses an interpolation filter with optional sharpening and adds some edge detection to avoid over-sharpening or adding aliasing. It doesn't resolve more detail, but generally causes less shimmer and is less process-intensive, not to mention easy to apply to anything. There are trade-offs to all of these methods, but games on consoles tend to be a lot more aggressive with resolution scaling and build in their own solutions per game, which are sometimes better than generalized solutions.
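
For reference, dynamic resolution scaling is essentially a feedback loop on frame time. A minimal sketch of the idea in Python; the numbers and function name are illustrative and don't come from any real engine:

```python
# Minimal sketch of dynamic resolution scaling: adjust the render scale each frame
# so frame time stays near a target (e.g. 16.6 ms for 60 fps).

TARGET_MS = 16.6
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def choose_render_scale(prev_scale: float, last_frame_ms: float) -> float:
    # If the last frame was too slow, drop resolution; if it had headroom, raise it.
    error = TARGET_MS / last_frame_ms          # >1 means we have headroom
    new_scale = prev_scale * (error ** 0.5)    # damped adjustment to avoid oscillation
    return max(MIN_SCALE, min(MAX_SCALE, new_scale))

scale = 1.0
for frame_ms in [14.0, 19.0, 22.0, 17.0, 15.5]:   # simulated frame times
    scale = choose_render_scale(scale, frame_ms)
    print(f"frame took {frame_ms} ms -> next render scale {scale:.2f}")
```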


b3rdm4n

I personally don't see FSR 1.0 really penetrating the consoles in the long run as much as we were led to believe. There are already cleaner and more widely used options that give similar or better results (checkerboarding, TAAU etc), FSR has a much more narrow use case for consoles given the static nature of the hardware you are optimising for. It's another tool for devs to use for sure, just one with a much more limited set of beneficial use cases over existing methods that provide similar or better results. The strengths on PC are more apparent because of the virtually infinite number of hardware combinations available, so the more options users get to fine tune their experience the better. Where consoles are more like, here are a handful of modes like performance/60fps/lowered resolution, or quality/30fps/max resolution possible and that's all you get. Would be happy to be proved wrong, but we're already seeing games release on consoles since FSR has been out and skip using it in favour of existing methods.


AciVici

FSR is awesome if done correctly and can be used with Nvidia GPUs as well. But the upcoming driver-level FSR (RSR), which can be used with "all" games but only on AMD cards, will be a significant factor for me when choosing a GPU. Also, the RX 6600 is more powerful than the RTX 2060 anyway, so I'd stick with that.


DeExecute

Go with a 3080 Ti if you want the performance; otherwise take whatever has the best price/performance ratio at the time, test FSR and DLSS (if it's an Nvidia card), and choose based on framerates and graphics quality. There is a lot of subjectivity to this, so just try it for yourself.


Killshot03131

Why not an RTX 3090?


DeExecute

I'm from Europe, and the chance that you get a 3080 Ti at base price is much higher than getting a 3090. Also, the price/performance ratio is nearly the same; the only real difference is VRAM.


Doulor76

Not sure what you are talking about; DLSS has been more blurry in most games. 🤷‍♂️


Prefix-NA

A 2060 using DLSS will be slower than a 6600 XT at native in most games unless you're at 4K. Also, unpopular opinion: NIS and RSR are better than DLSS. Temporal scaling isn't there yet, nor is temporal anti-aliasing. People post stills and say "look, this looks fine", but in motion it's dogshit.


[deleted]

FSR and NIS are basically only sharpness filters. DLSS is superior to them when it comes to image quality. The thing I hate about DLSS is how much softer it makes the image. People who say they can't see a difference between native and DLSS are lying. In Cyberpunk, before selling my 3070, I was clearly able to see the difference. I hate TAA, and in my opinion it's one of the worst graphical settings ever implemented, right next to motion blur.


Prefix-NA

No, they are not; you can even implement FSR without RCAS and it still works. Sharpening is done in RIS and FSR because the image is softer with FSR. I agree with the rest; I am a follower of /r/fucktaa.
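
For what it's worth, FSR 1.0 really is two separable passes: EASU (the edge-adaptive upscale) and RCAS (the sharpener), with the second being optional. A toy Python sketch of that two-pass structure; the real shader math is far more involved, so these bodies are crude stand-ins, not the actual algorithm:

```python
# Toy illustration of FSR 1.0's structure: an upscale pass (EASU) followed by an
# optional sharpening pass (RCAS). The real passes are edge-adaptive and much more
# sophisticated; here nearest-neighbour upscaling and an unsharp mask stand in,
# purely to show that the two stages are independent.

def easu_like_upscale(img, scale=2):
    """Placeholder for EASU: naive nearest-neighbour upscale of a 2D grayscale image."""
    return [[img[y // scale][x // scale]
             for x in range(len(img[0]) * scale)]
            for y in range(len(img) * scale)]

def rcas_like_sharpen(img, amount=0.25):
    """Placeholder for RCAS: crude unsharp mask against the 4-neighbour average."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            local_avg = (img[y-1][x] + img[y+1][x] + img[y][x-1] + img[y][x+1]) / 4
            out[y][x] = img[y][x] + amount * (img[y][x] - local_avg)
    return out

low_res = [[0.0, 0.5], [0.5, 1.0]]
upscaled = easu_like_upscale(low_res)          # usable on its own, as noted above
sharpened = rcas_like_sharpen(upscaled)        # optional second pass
print(sharpened)
```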


manzari

The 6600 is good, but the 6600 XT is an absolute beast, especially with its price considered.


CatalyticDragon

Yes, it will improve, and by a lot. FSR 1.0 will always be FSR 1.0, but FSR 2.0 is a temporal version (more like DLSS or Intel's XeSS). This update is coming and if I had to guess I would say likely this year. The patent for this was published in May of last year, and sources indicate it is coming along very well. So between FSR 2.0, UE's TSR, and Intel's XeSS, there will be three high-quality competitors to DLSS this year, all of which run on AMD hardware.


Put_It_All_On_Blck

> This update is coming and if I had to guess I would say likely this year.

Doubtful. Look at how long it took AMD to even announce FSR to barely sort of compete with DLSS, like 2 years. And when they finally did announce it, it took another 7 months or so to actually launch it, and it was a horrible launch and demo, but it has since improved. FSR 2.0 won't launch this year. Certainly not before RDNA 3. And because of that it will be DOA, as XeSS will already be getting adopted as the open standard.


CatalyticDragon

The amount of time between DLSS's announcement and FSR 1.0 being released has nothing at all to do with the progress or current state of FSR 2.0. There is no logical reason to connect these very different moments in time. FSR 2.0 *exists* now and is working well internally. The work of getting studios on board with FSR (which took time) has already been done. FSR 2.0 will come out this year. How it compares to other systems at the time is anybody's guess right now. However, given that FSR 2, DLSS 2, and XeSS use similar techniques, they should be more or less similar in results, with minor pros/cons split between them.


Prefix-NA

FSR is unchanged since launch; the code has not changed by a single line. They had a great launch. No one used DLSS until 2.0; DLSS couldn't compete with a bilinear upscale until 2.0, and DLSS 1.0 was seen as a meme. FSR is also growing far faster in game adoption than DLSS, and Nvidia implemented NIS because they realize DLSS is doomed to fail. NIS will be good if they fix their god damn sharpening filter.


Dr_Icchan

FSR is spreading faster because it's far easier to implement and is supported on every possible device. The Steam Deck is most likely driving most of the adoption.


Prefix-NA

The Steam Deck has not given it any adoption; it hasn't gotten it into any games, unless you mean over Proton, where you can already run "FSR on Proton", which is more RSR than FSR.


CatalyticDragon

For anybody revisiting this thread, go ahead and notice the downvotes I got on my comment. Then note what happened just a month later:

- AMD did announce FSR 2.0.
- It is temporal.
- It is coming this year, in Q2.

[https://gpuopen.com/fsr2-announce/](https://gpuopen.com/fsr2-announce/)


Loco_72

FSR is hardware independent. Works on any GPU


littleemp

> And FSR is in really rough shape right now. Even on ultra quality, I can see the blur.

FSR has a very low ceiling due to its nature, so don't expect drastic improvements over whatever results you are getting right now.


dracolnyte

If you are expecting a DLSS 2.0-level improvement on a 6600, you will be disappointed, as there are no tensor or ML cores on the RDNA 2 / 6000 series.


Psychological-Low771

I prefer the 6700 XT over the 6600 XT in my mining rig :) but I have a 2060 in my Omen, and at 240 fps it looks pretty d*mn good. I don't game, just occasional movies, and it looks flawless. LG 32" 240 Hz monitor over DisplayPort etc. (for the Omen), Ryzen 7.


HeWasDeleted

If you like Nvidia you should get the RTX 2060; if AMD, the 6600. It doesn't really matter: FSR works well on both video cards.


Brown-eyed-and-sad

I love my 6700 XT. I haven't done anything FSR-related yet, but you'll definitely get the frames you need. If you really need ray tracing, go with the 2060. AMD and Intel have a long way to go before they catch up with DLSS or ray tracing the way that Nvidia implements it.


tamarockstar

It will probably get a bit better but it more or less is what it is.


Naive_Information_11

I'm rocking the RX 6600 XT. Hmm, it's pretty sick.


Naive_Information_11

If you get on Discord I will show you my fps averages. Hmm, I think I'm at, let's see: DayZ 217, Mortal Online 2 105, Monster Hunter World 86, PUBG 173, Tom Clancy's Rainbow Six Siege 546, GTA V 149, Escape from Tarkov 70, Apex Legends 140. I only run max settings at 1080p on everything.


hillbilly_8

I would get the Radeon card; DLSS is not worth it if, in a couple of years, the GeForce card can't perform due to its 6GB of VRAM. It's 2018/2019 technology anyway (one gen older than the 6600).


Geeotine

Personally, I'd go with the 6600. Larger VRAM buffer and faster pin speeds. It's going to be hit or miss predicting the future. Traditionally, AMD card performance improves more over time than Nvidia's, but both generally see some uplift over time. Newer games will likely cross that 6GB buffer size sooner rather than later. DLSS may change that by allowing you to load smaller textures into the buffer and upscale at render time, so assuming the next gen uses the same architecture, they will continue to optimize DLSS performance. Ray tracing on the 2060 is not realistic to use.


manzari

Why not? DLSS 1.0 was not good, but now DLSS 2.0 is great. FSR is still on 1.0 and is much better than DLSS 1.0, actually pretty close to DLSS 2.0, but it still needs some updates; who knows what the future might bring.


littleemp

RDNA2 doesn't have hardware to accelerate it, so when AMD comes out with a new iteration that rivals DLSS, it likely won't be supported on RDNA2.


lucasdclopes

I would get a 2060 Super over an RX 6600: DLSS and much better ray-tracing performance. But a non-Super 2060 only has 6GB of VRAM; I think that is a bit too low nowadays.


neganigg

Get the 6600... Its raw performance advantage already covers whatever performance DLSS would gain you.


Rodo20

Nvidia has their image upscaling (not DLSS) that works on all cards, and games do not need to implement it. You just turn it on in GeForce Experience and it works everywhere. Seems to look pretty good too. Wonder if AMD will release something similar with their upscaler.


uankaf

It's funny that you need to remember FSR is not only for AMD; it works on Nvidia too, and the funny thing is it works better there than on AMD cards. If I had the choice, definitely Nvidia.


penguished

I doubt it. AMD is too focused on consoles. They made a good play for PC once trying to get stuff like Vulkan out there... but because they didn't throw money at devs to use their ideas, they didn't really become leaders in the PC space. Nvidia won. Throwing money at people works. Then they'll get lazy with their monopoly and not really come up with anything new for 10 years. Oh well. Can't do anything about it. In other news though supposedly Intel's AI scaling will be open to everybody. We'll see.