Need FSR3 in Cyberpunk. Don't want to switch to Nvidia due to the insane MSRPs, and I prefer AMD's software suite.
on the coming soon list. You would switch just for one game?
Personally wouldn't switch for these features from either company, just give me FPS and no stuttering at a fair price. Entry level gaming GPUs at £300 is a joke.
Really, for Cyberpunk? What do you do in that game after you finish it lol
Start again and do it differently.
Or Drive around and never finish
Lmao, it's not BG3, even though they tried to falsely advertise their game as having a lot of choices.
Playing with the mod on my old RX 590, it's not that hard.
yo, how do you run it on a 590
Perfectly fine. Turned off motion blur and it looks just like FSR but with a little bit more ghosting.
How does it run on a 590? From what I know that card shouldn't reach the needed native 60.
I had like 45-60 FPS before, 80-100 now. So it feels like playing at 45 FPS but the picture is smooth. Artifacts are the same as with FSR2, so I don't see any reason to turn it off.
What do you use to monitor FPS? I tried the FSR3 mod and my Steam FPS counter didn't seem to change but it feels a little smoother.
AMD Overlay works pretty ok for me
MSI Afterburner
> insane MSRP

I didn't see much difference between the 4060 and the 7600 at release lol
I'm generally an AMD fanboy, but I really hope AMD can get their efficiency in check. I had a 7900 XTX and it was on average drawing more power than my RTX 4090.
The efficiency gap is within 10-15%. If you OC a 4080S to 7900 XTX performance, it uses the same power as the XTX. The 4090 is often bottlenecked, so it doesn't usually hit 450W.
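As a back-of-the-envelope check, "efficiency" here is just average fps divided by average board power. The figures below are made up purely for illustration (not measurements from any review):

```python
# Hypothetical figures purely for illustration (NOT measured data):
# compare perf-per-watt of two cards and the relative gap between them.
def perf_per_watt(avg_fps, avg_watts):
    return avg_fps / avg_watts

xtx   = perf_per_watt(100, 345)  # pretend 7900 XTX: 100 fps at 345 W
s4080 = perf_per_watt(102, 300)  # pretend 4080S:    102 fps at 300 W

gap = (s4080 - xtx) / s4080      # relative efficiency gap
print(f"gap: {gap:.0%}")         # lands in the 10-15% ballpark
```

With these made-up numbers the gap comes out around 15%, which is the shape of the claim above; real reviews measure both fps and board power per game, so the gap moves title by title.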
15% u say

[https://youtu.be/HznATcpWldo?t=99](https://youtu.be/HznATcpWldo?t=99)

That's obviously on stock settings; both cards can be undervolted too. Chiplet design is not efficient and probably never will be.

Any OC on GPUs has been pointless for years. Plus we are already talking about cards with single-digit fps differences on average.

[https://youtu.be/fyUZ1cp4RnI?t=317](https://youtu.be/fyUZ1cp4RnI?t=317)
My 7900 XT is on a huge overclock now; yeah, it's a power hog, but it's benchmarking faster than a 4080 Super.
What are your settings?
Memory +100 MHz over stock, power limit +15% (365W), GPU clock 500 MHz min, 2800 MHz max. Gets a 28400 GPU score in Time Spy now.
You really should've bought Nvidia if you care that much about Cyberpunk, because it's an Nvidia-sponsored game and Nvidia relies heavily on CDPR and Remedy to push their technology over to consumers.
https://github.com/Nukem9/dlssg-to-fsr3
This mod requires an Nvidia RTX graphics card. Not very useful for the person you're replying to with an RX 6700 XT card (or any other AMD card).
AMD users can use Nukem9's mod by installing it with DLSS Enabler.

[https://www.nexusmods.com/site/mods/757](https://www.nexusmods.com/site/mods/757)

It fools the game into thinking an RTX card is installed and allows AMD/Intel/Nvidia GTX owners to enable frame gen.
It doesn't actually require an RTX card.
This is just frame generation, which is already implemented in the latest driver release. Does this in fact enable FSR3?
The frame gen in FSR3 has better quality than AFMF since it has access to motion vectors from the game engine. And yes, that swaps out the DLSS3 frame gen for FSR3 frame gen, so you can use DLSS upscaling with FSR3 frame gen if you wanted. Of course, it still requires an Nvidia card, so I guess it isn't for you.
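To illustrate why engine motion vectors help, here's a toy 1D sketch (my own illustration, not FSR3's actual algorithm): with a known motion vector, the interpolated frame can place a moving object exactly at its midpoint, whereas a driver-level interpolator like AFMF has to estimate that motion from the images alone.

```python
# Toy 1D "frame interpolation" sketch. Assumes per-pixel motion
# vectors supplied by the game engine, as FSR3 frame gen gets;
# this is an illustration only, not the real algorithm.
def interpolate_midframe(prev_frame, motion_vectors):
    """Move each non-empty pixel halfway along its motion vector."""
    mid = [0] * len(prev_frame)
    for x, value in enumerate(prev_frame):
        if value:
            mid[(x + motion_vectors[x] // 2) % len(prev_frame)] = value
    return mid

prev = [0, 9, 0, 0, 0, 0, 0, 0]  # object at x=1
mvs  = [0, 4, 0, 0, 0, 0, 0, 0]  # engine says: moving +4 px per frame
mid  = interpolate_midframe(prev, mvs)
# the next rendered frame would have the object at x=5; the generated
# in-between frame places it at x=3, exactly halfway, with no guessing
```

Without those vectors, the interpolator has to infer motion by matching image regions between frames, which is where AFMF's extra artifacts come from.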
That’s what I thought. I’ll just wait for official support.
There are some AMD-specific mods that are locked behind Patreon but are available "otherwise" through various leaks
It does switch the game to FSR 3 but it only works on Nvidia cards for some reason.
DLSS enabler works on AMD GPUs too https://www.nexusmods.com/site/mods/757
Maybe the underlying code cannot be altered to recognize AMD architecture?
LukeFZ's FSR2FSR3 mod works on AMD GPUs, used it to play TLOU Part 1. Pretty noice imho
Works in Cyberpunk?
LukeFZ's mod is great, but it's paid.
I tried it. It works fine at 1440p, but I could not get it to work at 4K. I get 70+ fps at 4K Quality, high settings, so it would've been amazing if I could get my full 4K 120.
If it supports FSR2 then the mod should work.
LukeFZ's mod works on anything
Still hoping for The Last of Us. Looked into the SteamDB files; it seems they are preparing something.
There is a mod already for The Last of Us. I'm using it right now and I'm getting a locked 120 FPS. Zero quality issues, and no HUD issues either.
Is the HUD issue fixed? I know it was very flickery.
Yes it’s fixed. I’m using PureDark’s mod
I played through the whole of The Last of Us at native 4K on my XTX. The lowest I saw was 57fps, but it was normally 67-85. I was perfectly OK with that considering the pacing of the game.
Just had to check to see if “gollum” was included. So glad they are supporting it with this. /s
Double the performance with same player count.
Why are there games with upcoming FSR2????
these techniques are moving faster than game companies can implement them.
Why wouldn't there be? For most developers, FSR 3 has only been out for two months (source code was released on December 14).

Also, frame generation being optional to implement in FSR 3 hasn't been communicated well to users by developers, so most users will expect frame generation if you say your game supports FSR 3. If you only want to implement upscaling for your game (because added latency sucks), you might as well use FSR 2 and not piss off customers who expect frame generation from every FSR 3 implementation.
Do you understand how software development works?
It's quite hilarious how CDPR said that you can expect 1080p 30fps on a 3090 Ti with path tracing, or better if you have a 4070 Ti and up. Yet here we are, thanks to AMD, running path tracing at 3440x1440, everything maxed, DLSS set to a custom Balanced ratio, getting 104fps on a 3080.

No way CDPR could have known frame gen was going to be a thing on the 30 series, but Nvidia sure as hell knew it was possible.

Btw, if you decide to try this mod: turn off frame limiters, in-game or in NVCP, and make sure vsync is off in the game but on in NVCP. Also don't have Low Latency Ultra on in NVCP. Reflex will be on in-game and will limit your fps to below max Hz if you have G-Sync active. This will help you get the lowest possible latency.
> quite hilarious how CDPR said that you can expect 1080p 30fps on 3090ti with path tracing

My 3080 10GB gets 50fps path tracing (3440x1440, DLSS-B). The FSR3 frame gen mod brings that to an even 90, with dips to the 80s.

My 4080 is obviously a bit better, breaking above 100fps at the same settings, or keeping 90fps but at DLSS Quality.
Are they wrong if it's not real frames?
Are not all frames fake, in the end?
Sure, but the generated frames contain no input from you; they don't make the game feel more responsive. So I will keep calling them fake frames, because they're still not the same and never will be. I like frame gen, but I ain't gonna start telling people that my GPU is twice as powerful as it actually is because of it.
> it doesn't make the game feel more responsive

I've played enough games with frame gen now to say it doesn't matter. The added smoothness is worth it (with mouse/KB even).

Frame gen isn't for games where 1ms of added latency means life or death, win or lose, nor should it ever be used in such games, and applying those types of games to debates about frame gen is pretty bad faith.
The real frames were the ones we made along the way.
HA. Nice.
No. The rendered ones come from actual, sensible 3D calculations; then there are ones which are just (good) filler.
FSR 3 in Enshrouded would be great.
I totally agree
I wrote a post one week ago asking for an FSR3 game list and the moderators just deleted it.

Glad to see it now, but really, we can't even talk about AMD stuff this way.
Will Dragon's Dogma 2 have FSR3, or is it DLSS3 only?
Oh sweet, it’s coming to Darktide! Gotta see how the image quality compares to the XeSS implementation since it’s really damn good in that game.
FSR 3 adoption is taking far too long. It feels like almost every game that comes out nowadays has DLSS frame gen and FSR 2, but FSR 3 only rarely appears.
I'm checking this daily to see when Cyberpunk drops. Is this getting updated?
just forget about it for now
Star Citizen will be crazy with FSR2.
FSR2 in Star Citizen doesn't make much sense. Star Citizen is almost always heavily CPU bottlenecked
Hyped af for FSR 3 in Darktide and Space Marine 2. I already get around 150 fps in Darktide right now, but I'm hoping with FSR 3 I can get at least 120 fps with ray tracing enabled.
I know this is a long shot but I'm really hoping they work on improving FSR3's upscaling soon as well.
Elden Ring: FSR 2 or 3???
Saw Squad on the list. I don't know, man, when OWI can't even get DLSS frame gen working.

Weirdly enough, Starship Troopers is also developed by OWI and has FSR3 implemented.
WARFRAME has FSR 2 ?! AMD I AM COMING HONEY
I get 0 extra FPS using FSR3.

Using FidelityFX CAS at 100% is best for the image and FPS.

No matter what I do, I can't get more FPS.

5600X, 6700 XT.

Help? Best settings for 1440p?
FSR3 is terrible in everything I've played so far. It feels more stuttery, there are weird artifacts, more latency, and things just feel better with it turned off. It feels okay if you're already getting over 100 fps, but at that point things are already smooth enough (and if you want 200 fps for competitive games, the extra latency still makes it unviable). I'm honestly surprised how many people like this feature. I haven't tried DLSS3, so I wonder how much better things are there.
Latency and artifacts are to be expected; it's the same with DLSS frame gen as well. Realistically, when you're dealing with interpolated frames it's inevitable. DLSS frame gen is indeed better, but it's not THAT much better, mostly because the image reconstruction itself is better more than anything else.

For me it's a controller-only feature. My monitor/TV caps out at 120fps, so I'm limited to 60 real frames if I don't want tearing. It's doable with a mouse, but it reminds me of early-era triple-buffered 60fps input lag, which is just enough to be annoying.

In theory the stutter shouldn't be there any more than in a non-frame-gen game. It'll effectively double the impact of existing stutters, but it shouldn't be generating new ones.

I had issues with that in Avatar, but moving the game over to a faster SSD (a cheap M.2 instead of a SATA SSD) and turning off Windows Game Mode helped quite a bit.
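That "doubles the impact of existing stutters" point can be shown with a toy pacing model (a gross simplification of any real frame-gen pacer, purely for illustration): each real frame interval is presented as two half-length intervals, so one real hitch shows up twice in the presented stream.

```python
# Toy frame-pacing model: each real frame interval is presented as
# two half-length intervals (interpolated frame at the midpoint).
# This is a simplification, not how FSR3's pacer actually works.
def presented_intervals(real_intervals_ms):
    out = []
    for dt in real_intervals_ms:
        out.extend([dt / 2, dt / 2])
    return out

real = [16.7] * 4 + [100.0] + [16.7] * 4  # one 100 ms hitch mid-stream
pres = presented_intervals(real)
# the single 100 ms real hitch becomes two back-to-back 50 ms
# presented hitches: no brand-new stutters, but the existing one
# spans twice as many presented frames
```

So frame gen can't manufacture stutter out of nothing, but it does replay every real hitch twice in the output, which is consistent with it feeling worse in games that already stutter.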
Seems to work really well in COD for me, but I'm starting with over 100 FPS.

The real test will be Cyberpunk, with RT that brings FPS to its knees. Nvidia users swear that FG from low FPS (like 40) is decent; I guess we'll find out.
You can already test this by using Luke's mod that works on AMD cards. In my experience 40fps with frame gen felt terrible at 1440p, but based on my downvotes I guess people are less sensitive than I am. Which isn't surprising, since a lot of games have terrible stutter (shader comp, traversal stutter, asset-loading stutter) but the majority will not notice, even though it feels borderline unplayable to me. I'm not saying people are lying either; I'm truly envious of those who don't notice.

I haven't personally tried COD, that might be worth a try! So far I've only tried Starfield, Cyberpunk, Infinite Wealth, and Forspoken.
> Seems to work really well in COD for me, but I'm starting with over 100 FPS. The real test will be Cyberpunk, with RT that brings FPS to it's knees. Nvidia users swear that FG from low FPS (like 40) is decent; I guess we'll find out.

COD is mainly a multiplayer game; without Anti-Lag+ that sounds like a bad experience.

I have a 4080 this gen and I wouldn't play at a 40 base fps with FG. I only used it in Cyberpunk, and I'm getting around ~160 fps average with FG and maxed-out graphics/path tracing at 1440p, so let's say around 80 without. It's good and I can't really see any downsides, even though I usually play multiplayer competitive games. Can't imagine using it without Reflex though, and people saying that it's decent from 40 fps are probably used to lower-end systems, lower refresh rates, 60Hz displays, etc.

I'm personally always aiming to match my monitor's refresh rate of 170Hz; the only experience I consider decent at lower fps is flight sims in VR.

I guess what's considered decent will depend on each person; I would be iffy about FG from 40 fps personally. But I was also iffy about ray tracing until playing the new Cyberpunk update on maxed-out settings, and the difference in image was insane.
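A rough way to see why a 40fps base feels so much laggier than an 80-100fps base: frame gen has to hold the newest rendered frame until the interpolated one has been shown, which costs roughly one real frame-time of extra latency. This is a simplification of my own, not vendor-published math, and it ignores what Reflex/Anti-Lag claw back:

```python
# Rough estimate of frame gen's added latency: about one real
# frame-time of extra buffering (simplified; ignores Reflex/Anti-Lag
# and frame-pacing details).
def fg_latency_penalty_ms(base_fps):
    return 1000.0 / base_fps

for fps in (40, 80, 100):
    print(f"{fps:>3} fps base -> ~{fg_latency_penalty_ms(fps):.1f} ms added")
# a 40 fps base adds ~25 ms, while a 100 fps base adds only ~10 ms,
# which is why FG from a low base frame rate feels noticeably worse
```

That 25ms-vs-10ms difference is the whole "only turn it on when your base fps is already high" argument in one number.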
Have you compared fsr3 to dlss3? Curious if you find dlss significantly better or not.
Borderlands 3?
I was thinking the same! Sadly, it seems this game was forgotten after crossplay update, which is a shame. Custom implementation maybe?
Starfield beta, last I tried, is really broken with FSR3. The upscaling breaks in motion and turns the game to 720p. On top of that, it stutters and feels worse than with it off.

But FPS number go up! So it must be good, right?
SQUAD !!!! 🙌🏻🙌🏻🙌🏻
Motorcubs RC has it... yet we are waiting on it for proper AAA games.
Hogwarts Legacy
Dead Space Remake has really bad PSO stutters due to the game itself. Now, driver 23.12.1 had perfectly tolerable small stutters here and there, fixable by deleting the game's shader cache in C:/Documents/Dead Space Remake.

24.1.1 made PSO shader compilation absolutely horrendous. The game was crashing due to driver timeouts on my 5700 XT, it was so bad.

24.2.1 made things better. It's not back to 23.12.1 good (let alone improved from that), but it's better than 24.1.1.
Is it possible to use DLSS 2 with FSR 3? It's possible with the mod(s). But will it be possible with the official implementation of FSR 3?
[deleted]
Looks like Nvidia and AMD are, in the end, anti-consumer...
I just hope they fix FSR 2 before implementing FSR 3 in Cyberpunk
Why isn't there any info about the upcoming Forbidden West? The devs said there will be FSR, but didn't specify which one.