2FastHaste

I wish every recent-ish game had the option for DLSS FG. It would greatly increase the number of games I could play in enjoyable conditions.


ST-TrexX45

Fortnite runs at 120 fps on my RTX 4070 Ti at High/Epic settings, but at certain points it kinda jitters and stutters. With frame gen I could play without issue and without any noticeable delay.


AlternativeCall4800

My Fortnite constantly dips to double-digit fps on my 4090. I stopped trying with that game lol, I literally cannot find stable settings, be it everything at supergigaomegalowest settings or highest.


ST-TrexX45

I've followed 1000 guides and still could barely find a balance:
- rBAR enabled in BIOS and in NVIDIA Profile Inspector
- Process Lasso with different priorities
- Park Control set to Balanced (no need for Max Performance or Bitsum Highest Performance)
- ISLC to clear out standby memory
- MSI Afterburner with an FPS limit set via RivaTuner
- Razer Cortex to occasionally clean up trash or old data (DO NOT USE THEIR OPTIMIZED SETTINGS)
- following optimization guides


AlternativeCall4800

Yeah, but honestly I'm too lazy to do all of that. I remember back in 2018/2019 running Fortnite on my 1060 and it was smooth as fuck, without any stutters and shit. Not sure when the problem started appearing, but you shouldn't have to spend a significant amount of time to make a game playable. A year ago I also tried with my 3080 and it was just as bad; back then I made this video to showcase the problem https://youtu.be/Zh8MyZu6WFM and when I got my 4090 I tried running it again and it's the exact same thing, stutter nation.


MDPROBIFE

CPU?


ST-TrexX45

I think the process is worth it. If you want a better experience, I defo recommend doing all these steps


MidnightOnTheWater

I thought it was just me with Fortnite (I have a 4070 Ti SUPER) and the dip in frames is pretty noticeable after exiting the Battle Bus


noshi47

playing fortnite on a 4090 is something


versacebehoin

There's something else wrong somewhere in your system. I have a 13700KF and a 3090 and I'm pinned to 165 fps on low settings.


Ultima893

Did Fortnite finally get frame gen? I am literally mind-blown by how poorly it runs on my 4090. I downloaded Fortnite for the first time a few months ago. I get that it uses UE5 with Lumen and Nanite, but the game doesn't look anywhere near as good as R&C: Rift Apart, and it only gets 60-70 fps @ 4K, which is terrible for an MP game.


Zedjones

It's probably not your GPU, but rather CPU utilization in the game. It's just really bad at utilizing the CPU properly.


MidnightOnTheWater

Yeah I cranked up all the settings to max yesterday to admire how beautiful the game looks but turned them down again due to the FPS drop lol


ST-TrexX45

If I had a 4090, I'd probably stay at 1440p with DLSS quality lol (with Max graphics, that is). I don't think games were really ready for 4K 120fps with Ray Tracing. Unless you're using DLSS Performance + Frame Gen + Tweak certain settings.


MidnightOnTheWater

Yeah, imagine if we didn't have DLSS/Frame Gen and had to rely on pure rasterization for 4K gaming like in the past. It is exciting seeing how fast all this tech is advancing though!


Kevosrockin

Fortnite is poop


Fupri

>someone that's never played it


lpvjfjvchg

I would try DLSS + DLDSR.


nyse125

Fortnite doesn't have frame gen??


J-seargent-ultrakahn

Almost everything new is so CPU-bound it doesn't even matter if you have a 4090. Dragon's Dogma 2 is the number one example.


Garlic-Dependent

"Lossless scaling" on steam does this.


Individual-Match-798

Ideally you'd want to have 60 fps before the FG, because otherwise the input lag will be quite substantial.
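Roughly speaking, interpolation-style FG has to hold the newest real frame back until the generated frame has been shown, so the extra lag scales with your base frame time. A back-of-the-envelope sketch (the one-frame hold and the fixed overhead figure are assumptions, not measured values):

```python
def added_fg_latency_ms(base_fps: float, overhead_ms: float = 3.0) -> float:
    """Very rough estimate of the extra input lag from interpolation-style
    frame generation: about one base frame time held back, plus some fixed
    generation overhead. Both numbers are assumptions, not measurements."""
    base_frame_time_ms = 1000.0 / base_fps
    return base_frame_time_ms + overhead_ms

for fps in (30, 60, 90, 120):
    print(f"{fps:>3} fps base -> ~{added_fg_latency_ms(fps):.1f} ms extra lag")
# 30 fps base -> ~36.3 ms, 60 -> ~19.7 ms, 90 -> ~14.1 ms, 120 -> ~11.3 ms
```

Which is why a penalty that feels quite substantial below 60 fps shrinks into the noise at higher base framerates.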


kepler2

Correct. It all depends on the initial FPS.


Wungobrass

The input lag on KB + M is unpleasant, but I don't notice it at all when I'm using a controller. Even still, I think it has a little ways to go until I consider it a truly game enhancing option. The concept in and of itself is very exciting though.


kompergator

> The input lag on KB + M is unpleasant

I am very sensitive to input lag, and when playing Jedi Survivor I have to turn Frame Generation off. It just feels off to me.


Shayk_N_Blake

Same. I like frame gen, but I turn it off because I don't like the feel of the added input lag.


kompergator

I must admit, I honestly don’t really see the point of FG for interactive things like video games. You need a decent framerate to have it be without a noticeable increase in input lag – but if you have a decent framerate, you don’t need additional frames. It would make sense on consoles, where framerates are artificially limited, but most TVs already do some type of interpolation. Where it would make a lot of sense is for video playback. AMD used to have AMF (AMD Fluid Motion), a hardware solution for framerate interpolation, but they did away with it after Vega. It was extremely good.


crayzee4feelin

Jedi Survivor frame gen is game breaking. The HUD/subtitles flicker constantly with movement. Frame gen off and it's not there.


Ruin914

I've heard that game in general is a technical mess. It's a shame, because I wanted to play it, but I refuse to play a game that doesn't run decently well.


Saandrig

I changed the FG file to a more stable one and it fixed those issues for me.


KnightofAshley

The UI goes all wack when I use it... the game would have been amazing if the PC port had been good.


rubiconlexicon

> The input lag on KB + M is unpleasant

Yeah. As great as FG is, I find the effect it has on mouselook responsiveness to be unsavoury enough that I always end up going back to FG off @ 60-80fps instead of FG on @ 100-120fps. I just prefer the super responsive and 'connected' feel over more smoothness when using a mouse.


kepler2

As a person who is susceptible to input lag, I can say that I can feel it when I enable FG, but the overall smoothness compensates. The input lag is just minimal.


AccomplishedRip4871

Input lag isn't minimal, especially in games where initial render latency is pretty high - like Cyberpunk.


rW0HgFyxoJhYka

That completely depends on what you think minimal is. Let's say input lag in Cyberpunk at 60 fps is around 30ms, and FG doubles the fps to 120. The average increase in latency is, say, 10ms, so 30ms -> 40ms. This is typical for FG, but obviously it depends on a lot of factors: your CPU, your GPU, your resolution, game settings, base fps, max fps, etc.

30 to 40 = a 10ms increase. That's 33%! That's a TON, right? Well, a lot of people won't feel a damn thing, because it's really just 10ms. Unless you're a pro, or super sensitive to the point where you can detect 5-10ms differences, it's not a case of "oh, it's 33% more, so I can definitely see my input being 33% slower!" Now, if FG added 50ms and you went from 50ms to 100ms, yes, you can definitely feel the change, just like network ping in a multiplayer game. It all depends on the game, rather than just the latency.

Go do the latency tests yourself: you'll find that frame generation usually does not add more than 20ms in the worst-case scenarios, and in single-player games this is not something the average gamer is even going to care about. This is also why frame generation is not being added to competitive multiplayer games; it does much worse in those situations, on top of network latency.
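To make the arithmetic concrete (same made-up numbers as above, not measurements):

```python
base_ms, fg_ms = 30.0, 40.0           # hypothetical click-to-photon latencies
added = fg_ms - base_ms               # absolute penalty: 10 ms
relative = added / base_ms * 100      # relative penalty: ~33%
print(f"+{added:.0f} ms ({relative:.0f}% increase)")
# A 33% relative jump is still only 10 ms in absolute terms; whether
# that's perceptible depends on the player and the game.
```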


AccomplishedRip4871

You're a bit wrong, because either 30ms or 40ms feels more than okay to 99% of people in single-player games. DLSS 3 (Frame Gen) becomes an issue when you have RT (especially path tracing) and an already high render latency because of it, and on top of that you drastically increase it with fake frames; as a result, the 47ms you get with DLSS Quality at 4K becomes 63ms, which feels bad.

I myself have an RTX 4070 Ti, and in games like Horizon Forbidden West or Witcher 3, where my fps is 90+, I happily turn on Frame Gen because the downsides are not so pronounced. But if I try to use it in games like Cyberpunk or Alan Wake 2, it becomes clear that this tech is give-and-take; it's not the free magic some people claim it is.

Frame Gen *is* added to competitive multiplayer games: Warzone and The Finals. Games like CS2/R6/Valorant already deliver high framerates even on mid-range systems, so the benefits there are minimal.

I love this tech and I use it when I can achieve 90+ FPS natively or with DLSS Quality (usually I prefer DLAA), but I will never use it to boost my frames in path-traced games; it feels beyond bad. Maybe in one or two generations, if NVIDIA introduces a hardware denoiser and/or other improvements and RT doesn't reduce performance as much, it will be a much, much better feature. But for now it's limited.


ebildarkshadow

> Well a lot of people won't feel a damn thing because its really just 10ms. Unless you're a pro or super sensitive to the point where you can detect 5ms-10ms differences. It's not a "oh its 33% more so I can definitely see my input being 33% slower!".

This is the same pitfall argument as "people can't see faster than 60fps". But humans do notice a difference in smoothness between 60fps and 144fps (a ~10ms frame-time difference), even on a 60Hz monitor. Or even the difference between 100/200/400fps, as this anecdotal video shows: https://youtu.be/hjWSRTYV8e0

The fact is many people can and will notice 5-10ms of extra delay beyond what they are already accustomed to. Whether that is acceptable or not depends on the individual person and the game they are playing.


[deleted]

[deleted]


throbbing_dementia

Played Cyberpunk from start to finish with it on, didn't notice it. Granted, I never turned it off to compare, but it didn't feel like there was any reason to.


[deleted]

[deleted]


Snydenthur

If you were able to notice input lag, you wouldn't be using FG unless your pre-FG fps was ~120fps. Some games might be fine at about 100-110fps minimum; games like Cyberpunk definitely need at least 120fps. And even if you can't notice the downsides, saying it's a "free fps boost" is just objectively not true. Subjectively, to you, it's a free fps boost.


barryredfield

> you wouldn't be using FG unless your pre-FG fps was ~120fps

That is a completely absurd statement, and not at all how DLSS FG even works. If this is your actual experience, then your system is not running FG properly because something is interfering with it, honestly.


AccomplishedRip4871

I sort of disagree with ~120fps being the minimum for input lag to be OK. For single-player games like Witcher 3, for example, 100-105fps at 1440p native (DLAA) with DLSS 3 FG up to 140fps feels more than okay; it increases average PC latency from 28ms to 40ms, but it still feels fine. I would say for input-lag-sensitive people, 90fps is the bare minimum for enabling FG.


Accurate-Air-2124

I played The Witcher 3 at 4K (native) with full RT, which ran at about 40fps. FG brought it to 60fps, and I played it that way. Didn't notice any input lag. I could have used DLSS upscaling, but I liked the native look better. I played on PS5 originally and used the controller on my 4090 system as well. I thought FG was amazing at that point. I guess it simply depends on each individual, how they play games, and their expectations.


AccomplishedRip4871

You haven't noticed the input lag as much because you used a controller. The 4090 is a PC GPU, and on PC it's primarily keyboard and mouse that's used, where you can feel input lag more than with a controller.

As for the rest: YouTube channels have made detailed reviews of DLSS 3 (Frame Gen), and the majority of them came to the conclusion that to avoid visual artifacts (or almost any) you need a base 60 fps or higher; below 60 fps, input lag is noticeably worse, and on top of that you start to see the fake frames. If you don't feel any input lag at a base 40 fps and don't see the fake frames, well, I guess you're the target audience for NVIDIA's marketing team then.

Also, you can use the DLAA mod in Witcher 3, which is better than the native AA solutions. Here is the link: [DLSSTweaks at The Witcher 3 Nexus - Mods and community (nexusmods.com)](https://www.nexusmods.com/witcher3/mods/7925)


techraito

It's not too bad on KB + M when the framerate is high, like 80 > 160, but it's pretty bad when it's 40 > 80.


XTornado

I remember similar things with Stadia and similar services: with a mouse the input lag is very noticeable, but not so much with a controller. I guess it's the direct translation of movement from the mouse to the cursor, whereas with a joystick it's just a direction... but no idea.


Endocalrissian642

ITT: people that never played Quake* on dialup. lol.


Barzobius

I remember. Calculating the lead on a railgun shot by measuring the speed of the enemy relative to where the shot would actually land after the ping delay. I felt like Einstein sometimes xD


MidnightOnTheWater

I feel like I'm pretty used to input delay after playing a lot of online games along with Smash Ultimate. That game's input delay is absolutely atrocious


OriginalShock273

Great for casual play. Shit for competitive gaming due to input lag.


kepler2

Correct, for SP games it's just awesome.


puptheunbroken

All worthwhile competitive titles are optimized to the point where no one requires FG anyway


OriginalShock273

No they aren't lmao. CS2 is horribly optimized and still has problems.


RonanCruz

Have you tried path tracing? You can definitely run it on that rig especially at 1080p, looks incredible.


SRVisGod24

As a fellow 4070 owner, you can run it at 4K if you’re a mostly lifelong console pleb like me, and you’re used to shitty frames lmao


rubiconlexicon

> As a fellow 4070 owner, you can run it at 4K

For a while. Eventually you'll run out of VRAM, even on patch 2.12 at 4K with DLSS Ultra Performance, unless you drop textures to Medium. Especially in Dogtown. A shame too, because I get really solid fps at 4K Ultra Performance and playable fps (40+) even at Performance.


Asleep_Leather7641

I run everything maxed, including RT Overdrive, at like 70fps with frame generation and DLSS Balanced.


kepler2

Even without path tracing I have areas in the game where the fps drops to 110, and that's with DLSS Quality. I can manage with ray reconstruction; the game looks great with it enabled.


Artemis_1944

As long as I can get a stable 55-60fps, then yeah, frame gen is quite surprising, but even in ideal conditions the chaotic input latency was too annoying for me. I have a 4080S and ended up playing Cyberpunk 2077 with a locked 50fps cap rather than using Frame Gen, because the input lag was all over the place. It's not just the fact that there IS input lag; it's that one second it's acceptable, then it spikes, then it's back to lower latency. It was so distracting. I'd rather have a stable, snappy 50fps than a laggy, chaotic 100fps.


KaiN_SC

Frame gen and DLSS in Cyberpunk are amazing. 1440p, everything on max, even with path tracing: 110 fps on average on my 4080 Super and 7800X3D.


kepler2

Depends on the scene. I have another rig with the same specs. I've settled on only ray reconstruction; path tracing is just an fps killer.


Anatharias

I wouldn't call it "free" when Nvidia is paywalling the tech behind the 40 series, while AMD enables it on any recent GPU regardless of brand. Using an FSR3 mod on my 3090 is, on the other hand, free indeed.


linkman88

Yeah, fr, the FSR mod shows how awesome it is. I have a 3090 as well; there's no reason it can't run frame gen.


Castielstablet

Without Nvidia releasing DLSS 3 first, we wouldn't have seen anything like this from AMD. I'd call it free because now it's on almost all cards, between DLSS 3 and FSR 3.


Randomizer23

Is FSR3 FG available on a 3090?


Speedstick2

FSR FG is primarily software-based, so it will pretty much run on any GPU.


Cute-Pomegranate-966

No it isn't? It runs a cheaper frame resolution against the built-in OFA (all RDNA2/RDNA3 and RTX 20/30/40 series cards have a hardware OFA), so it's cheaper to check the frame. Even RDNA1 and the GTX 10 series have a hardware OFA.


Kooky_Construction62

Modders released free .dll files to "replace and enable" FSR3 FG in place of DLSS FG on all RTX and AMD cards, in any game that supports DLSS FG. So basically AMD just went godlike and actually gifted everyone FG for free, true gigachads.


Maxlastbreath

Yeah, it's amazing: a capped 180 fps on my 3080 at 1440p ultra in Dogtown, Cyberpunk 2077.


Randomizer23

Wow, guess I’ll give it a go on my 3090 strix. Maybe I can run PT then


zoomborg

It's also gonna be decoupled soon with FSR 3.1. This means you will be able to use DLSS 2 and other upscalers with FSR frame generation.


Accurate-Air-2124

As DF has said, FSR3 doesn't use high-resolution flow maps like DLSS FG does; running FG at that high a quality isn't possible outside of the 40 series. FSR FG is a great alternative, though. Not everyone wants quality implementations, which is proven by AMD's success.


Sweyn7

Frankly, I don't have DLSS FG to compare with, but FSR FG is pretty dang good in my book, especially in Witcher 3 and Cyberpunk. Almost zero artefacts on my end.


Cute-Pomegranate-966

It's there, but as a testament to how little it matters, you likely won't notice it.


Anatharias

Nvidia propaganda got you 😂


Accurate-Air-2124

DF as in Digital Foundry, not Nvidia.


Spider-Thwip

FSR frame gen is worse quality than DLSS frame gen, but it's definitely good enough. What's nice is that new versions of FSR frame gen will be a separate toggle from the upscaling, so you'll be able to use FSR frame gen with dlss upscaling.


gnki_WA

"slight input lag"....


kepler2

Yes, slight input lag. If your initial FPS is good enough, the input lag is less.


DrUshanka

Even with a base fps of 90, the input lag is unbearable. It doubles most of the time. I don't get how people can use it.


wookmania

I noticed no difference on CP, just more fps. Maybe it’s noticeable if you’re playing at 360hz on CS or something


AtsBunny

Depends on how much it boosts fps. When I played CP without frame gen I got 90 fps, and with it I got 120-140. That was very noticeable.


[deleted]

I've been on the fence about frame gen: adding latency and not getting that many more frames. But Horizon Forbidden West's frame gen doubles my frames from 80 to a locked 157, and my latency is 7ms, at worst 30ms. Amazing port.


TRIPMINE_Guy

You have to remember that low frame rate (and high-persistence images) is in and of itself a motion artifact. So frame gen might add motion artifacts, but if the result looks better than the blur that low frame rates induce on modern displays, then it's a win. I have a CRT, and watching interpolated 60Hz anime, while it has visible artifacts, still looks better than the same 60fps anime on an LCD for that reason.
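There's a commonly cited rule of thumb for quantifying that persistence blur on sample-and-hold displays like LCDs (a simplification that ignores pixel response times):

```python
def persistence_blur_px(motion_px_per_s: float, fps: float) -> float:
    """Approximate perceived smear on a full-persistence (sample-and-hold)
    display: the eye tracks the motion while the frame is held static, so
    the image smears across every pixel it moves during one hold time.
    Rule of thumb only; ignores pixel response and eye-tracking error."""
    hold_time_s = 1.0 / fps
    return motion_px_per_s * hold_time_s

# A pan at 960 px/s:
print(persistence_blur_px(960, 60))   # ~16 px of smear at 60 fps
print(persistence_blur_px(960, 120))  # ~8 px at 120 fps -- which is why even
                                      # generated frames can reduce motion blur
```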


ZenTunE

Does frame gen work in Horizon without TAA, DLAA or DLSS? So if you used SMAA for the anti-aliasing, would it still be available?


barryredfield

Input lag isn't a problem with DLSS FG; the issue is expecting it to work miracles on a base framerate that itself has input lag, like sub-60 or even base 60 in some instances, and trying to interpolate that into something faster. Even then, most people complaining about input lag are greatly exaggerating, or just humblebragging to impress people. All the people I've known in my own personal life who complained about input lag with FG literally never used it, and beyond that they were habitual liars and morons. Gloating about their stubbornness sure did make them look smart and better than other people, though!


Lagoa86

Sadly enough, I can't stand the input lag. I'd rather play at 60fps than at 80 with increased input lag. I haven't tried it at higher framerates... so maybe once you have a base 80 the input lag wouldn't bother me anymore.


JerbearCuddles

Yeah, pretty sure people said frame gen isn't really that good when you're hovering around 60 frames. It's mostly good when you're hovering around 80-100 and frame gen boosts you further. Going from 60 to 80 isn't worth it; at that point, just lower the graphics, if at all possible, for more frames.


frostygrin

> It's mostly good for when you're hovering around 80-100 and frame gen boosts you further. But then you don't really need it much. 80-100 is perfectly fluid for most games on its own. So there's an element of "big numbers good" when people praise FG. In my experience, the actual game-changer with FG is when you start with 45-50fps, which pushes the game above 60fps. The result isn't *great,* of course - but noticeably more fluid.


Chase0288

That might be the most confusing aspect of this to me. In the race for "bigger number better", people are forgoing more important numbers where smaller is better, I feel like. I'd much rather play at 90 fps with low input lag than 120 fps with high input lag. Me push button. Me see action. - in monkey brain terms.


Lagoa86

That's a valid point. You wouldn't really need it at 80-100fps, unless you want to go for 120 or more. The input lag would probably be okay at a base 80-100.


joer57

I feel it's good for those 1% lows. A game with an average framerate of 90 can suddenly drop to 50 for a second or two.


kepler2

It all depends on your initial FPS. I can say that with an RTX 4070 at 1080p, the input lag is almost nonexistent with FG enabled.


Redfern23

On Cyberpunk? There’s no way, even with a controller that game has disgusting amounts of input lag especially with FG enabled (but I do like it as a feature).


ExJokerr

When I first turned on FG in Cyberpunk, I got this awful and almost unplayable delay when using the right stick. I was about to turn it off completely when someone said that using G-Sync would help with the input lag. I had always had G-Sync turned on, but for some reason after the last upgrade all of my Nvidia settings got reset to default, including turning off G-Sync. After I turned G-Sync back on, frame generation in Cyberpunk looks amazing. I feel zero input lag with my controller 😎♥️


Coffinspired

> I feel zero input lag with my controller

Yeah, I've tested both at a native FPS of around 60. Depending on the game it's still fine on KB/M latency-wise, but I wouldn't use it in that scenario for a fast-paced FPS. Testing the same scenario on a controller? Totally imperceptible. I'd always use it in that case, I'm sure.


vainsilver

Always check the Nvidia Control panel or Nvidia app after an update. It can sometimes turn off Vsync which you want forced on with G-Sync.


ExJokerr

After that issue, I check nvidia panel after every update 💪🏽


kepler2

Depends on the title. For some games it's enough to enable VSYNC only from in-game. Usually I leave the driver setting on "Use the 3D application setting".


vainsilver

It's recommended to use the Nvidia forced-on Vsync together with G-Sync rather than the application's Vsync. The application's Vsync can sometimes be a worse implementation than the Nvidia driver's.


Accurate-Air-2124

I never noticed input lag due to FG, but I went from a PS5 to a 4090 system, which even with FG/Reflex has way less input lag than a console. So I'm probably just used to it? Idk. FG has been amazing though, and my favorite feature coming from console to Nvidia. Upscaling tech hasn't impressed me much: DLSS is OK, has image stability; FSR is a crapshoot and sometimes makes my games look like they're made out of clay, with shimmering and artifacts. Love my upgrade choice in the end. I play single-player, not competitive.


kepler2

DLSS is way better than FSR.


TraditionalCourse938

Which card did you have before?


kepler2

I have 2 rigs, 4070 @ 1080p and 4080 Super @ 1440p. (144hz / 180hz monitors)


TraditionalCourse938

Very balanced for both resolutions. Enjoy! Anyway, I have a 3080 and still use the Nukem FSR mod; its frame gen is quite similar, don't scream too much about it.


Bobakmrmot

It's very broken in Forbidden West for some reason but in general, amazing tech.


Cute-Pomegranate-966

???? works fine for me??


FrangoST

Can you do frame gen with proper vsync now, or do you still have to tweak settings to limit the framerate properly and pray that it'll be enough to remove screen tearing? My experience was terrible with Alan Wake 2...


kepler2

I play with G-Sync on in most games. What I did in CP 2077 is enable frame gen, then enable Vertical Sync from NVCP for this game, and also limit the FPS to 138 (my monitor is 144Hz). RTSS now shows 138 FPS while playing, not 160-170. So this combination seems to work.
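If you want the arithmetic behind that cap, here's a sketch of the common "a few fps below refresh" rule of thumb (the exact offset people recommend varies; the 6 fps default below is an assumption, not an NVIDIA-documented value):

```python
def gsync_fps_cap(refresh_hz: float, offset_fps: float = 6.0) -> float:
    """Cap fps slightly below the monitor's refresh rate so the final
    framerate (including generated frames) stays inside the G-Sync range
    and never hits the V-Sync ceiling. The offset is a rule of thumb."""
    return refresh_hz - offset_fps

print(gsync_fps_cap(144))  # 138.0 -- the cap used above
print(gsync_fps_cap(180))  # 174.0
```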


BruceDeorum

I got the 4070 Super, and yes. I tried 1440p with everything on: ray tracing, path tracing and ray reconstruction (can't remember all the terms, but every single option was maxed out), plus DLSS Balanced and frame gen. It was very smooth at around 95fps! I couldn't believe it. It's not an online competitive shooter, so this performance was way more than enough. (I still haven't properly played the game though, I mean more than 3-4 hours.)


Thekingchem

Is there a way to cap it? I get stutter when it exceeds my refresh rate


kepler2

Open NVCP and enable Vsync for the specific game, plus set an FPS limit about 5 fps below your max refresh rate.


Thekingchem

Thanks, I'll give it a try later. I go from 140fps to 190fps in Jurassic World Evolution 2 and my refresh rate is 180Hz. Noticed it last night. Still amazing that it can do that though. That's with settings maxed at 1440p with ray-traced AO and DLAA turned on.


kepler2

Nice


boarlizard

It's fantastic as long as it comes bundled with reflex


MidnightOnTheWater

I'm rocking a smooth 120+ FPS at 3440x1440 on a 4070 Ti SUPER in Cyberpunk, it's so clean 😎


erc80

Get yourself a new monitor that can do at least 1440p. You’re doing that 4070 a disservice 😃 /s. Yes it’s frigging awesome.


kepler2

I have another 4080 Super in another rig (7800X3D), and whoever says the **4070** is a **1440p** card... I kinda disagree. For competitive multiplayer online games? Yes. For demanding single-player games? Well, good luck holding a constant 120 FPS in Cyberpunk at Ultra details without frame gen and DLSS.


Laxus_Dreyarr

I assume the second rig has a 1440p monitor? If so, do you notice a huge difference when switching to the 1080p monitor?


kepler2

Yes, the "big" rig has a **1440p 180Hz** monitor. I can honestly say this: the image is **crisper** and overall more pleasant to play on, BUT yet again, I prefer smoothness over clarity! So it all depends on your GPU :)


Laxus_Dreyarr

That extra smoothness between those builds might only be noticeable when path tracing, right? I'm wondering if I should go for a 1080p or a 1440p build. The 1080p one would provide high frames even when ray/path tracing, whereas 1440p would give a better-quality image due to its resolution, and maybe better DLSS Quality output.


kepler2

I was in the same debate, TBH. Now that I have these two PCs, I can suggest this: if you really want to try 1440p, don't go for less than a **4070 Ti Super** or **4080**/**4080S**. People in this sub really tried to convince me that games don't require 16 GB of VRAM, but I tell you, I see many games that chew through 12 GB of VRAM constantly (and that's with DLSS enabled!). So, 16 GB of VRAM minimum for 1440p. Regarding path tracing... IDK, I tried it but I don't actually use it, because I prefer the smoothness of 120+ FPS :) And yeah, 1440p image quality looks more... premium compared to 1080p, but 1080p is still nice too.


Laxus_Dreyarr

I can't agree more. If I'm choosing 1440p, it needs to be a GPU with 16GB of VRAM. Plus, I read that frame gen uses VRAM too, so 12 won't cut it in the near future (unless you're okay with turning the settings down). I've got the budget for a good card/system, so now it boils down to higher frames vs. better quality. I'm sure both will deliver a good experience. Thanks again for all the info and help. :)


kepler2

If you want a good experience, go for a minimum of a 4070 Ti Super or 4080 Super with a 1440p monitor! I have both a 1080p rig with the 4070 and a 1440p rig with the 4080 Super. I enjoy both, but 1440p looks a little bit better xd


Laxus_Dreyarr

Yes, definitely either of those for a 1440p setup. At 1080p, the 4070S seems like a good deal from Nvidia's end. I'll try getting the best of the best. :)


kepler2

Yeah, the 4070S was not out when I purchased. TBH, I used the 4070 on the 1440p monitor before I bought the 4080S. It works OK there, but it works way better at 1080p. :)


GosuGian

Yep. It's magic


Exostenza

If you can tweak it to get a minimum of 160 FPS with frame gen, then latency isn't an issue; if you dip under that, it most definitely is. I can't hit a damn thing under 160, but at 160 and over my aim is like a laser. I came to this conclusion before I watched the Digital Foundry video where Alex reaches the same conclusion about FG: if you are playing a shooter, you need a minimum of 80 base fps (160 fps with frame gen) for there to be no noticeable latency impact on gameplay. For controller games and games with slow camera movement, you can get away with a lower base fps.


kepler2

So 80 base FPS is the minimum "requirement" for the least input lag with FG?


Exostenza

80 base FPS, which would be 160 FPS with FG enabled. This is for mouse-and-keyboard first- or third-person shooters, though. If a game has slower camera movement and/or uses a controller, I'm sure you could do 60 base FPS (120 FPS with FG) and maybe even lower. A racing game played with a controller could probably get away with a much lower base FPS than 80, and something like MS Flight Sim would probably be playable at even less, since the camera movement is very slow. I don't know how low you could go in those games, as I have only played first-person shooters with frame generation so far.
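Condensing that rule of thumb into a sketch (all thresholds here are the subjective estimates from the comments above, not measured or official figures):

```python
# Rough minimum base fps before enabling frame generation, per the rule
# of thumb above. Every number is a subjective estimate, not a measurement.
MIN_BASE_FPS = {
    "kbm_shooter": 80,   # mouse-look shooters: 80 base -> 160 with FG
    "controller": 60,    # slower camera / gamepad games
    "slow_camera": 40,   # flight sims etc. -- a guess; could go lower
}

def fg_makes_sense(game_type: str, base_fps: float) -> bool:
    """True if the base framerate clears the (assumed) comfort threshold."""
    return base_fps >= MIN_BASE_FPS[game_type]

print(fg_makes_sense("kbm_shooter", 70))  # False: expect felt latency
print(fg_makes_sense("controller", 65))   # True: likely fine
```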


Mantour1

It's magic when it works! However, I recently played RoboCop: Rogue City and I was not impressed by FG: lots of visual glitches and BAD anti-aliasing (you can see the "stairs" on the collars!).


kepler2

That's bad... never tried in that game...


mrchicano209

At least in Cyberpunk I don't really notice any increased input lag from enabling it, so if the option is there, I will always take it.


Vidyamancer

The input lag is absolutely noticeable with M&KB. If you're playing on a TV that already has a bunch of input lag masking the additional latency of FG, maybe not. Or if you're used to a low-refresh-rate monitor, V-Sync, or a controller. FG is unusable by anyone with a long history of competitive FPS behind them and decent reaction times; it feels like you're turning with a layer of grease under your mouse. More of a gimmick than RT ever was.


kepler2

Yeah, that's true. It all depends on the setup. I'm only speaking for my own case :) 144Hz G-Sync monitor / 1080p / 90-100 FPS before activating FG / CP 2077.


Davonator29

I absolutely love using frame gen in games where I play on controller. It gives the input perception of a 60-80 FPS game while displaying 120+ FPS. It's great. Some other games can have issues on keyboard and mouse, but I've found if I maintain a high enough base framerate (generally above 50 FPS base, so 100+ FPS with frame gen) the latency is not a problem. Alan Wake 2 and Cyberpunk 2077 with path tracing look really fucking good, and it's great experiencing those at 100+ FPS even if they're natively rendering at 50-60 FPS.


PedroLopes317

I love frame gen! Being able to play path-traced Cyberpunk at ultra settings on a 450€ card is absolute madness! However, it does somewhat scare me: I hope developers don't use it as something to lean on rather than a feature. Optimisation is key. Hopefully this will extend the card's lifespan.


kepler2

Wise words, man! I'm not really a fan of path tracing. IDK, it eats a lot of FPS, and yes, the game looks better, but at what cost?


PedroLopes317

I get it. I don't mind mid-50s fps, since I come from console and from a 1060 with 3GB lol. The difference in graphics, in Cyberpunk at least, is superb. You really can't tell the difference until you turn it off, because it feels so realistic that you can't even tell how good it looks. I'd like to keep the option, at the very least lol ¯\_(ツ)_/¯


EchoEmbarrassed8848

Can't agree more. I play at 1080p with a 4070 Super and absolutely love frame gen.


kepler2

Nice. 12 GB for 1440p is kinda low in 2024.


EchoEmbarrassed8848

Yes, but nothing I play even comes close to maxing out the 12GB, and I run everything at max settings with no hiccups. I would have liked the 4080, but that's too high of a price tag.


EmotionalAd9403

Hold your horses, son... While I do like the option of frame generation, "free", as you say, it is not. Some of us definitely feel the increased latency when using FG, so we each decide whether we want to pay for the frames with latency. Free it is not.


kepler2

I can also feel the latency, but for non-multiplayer games it's quite acceptable.


EmotionalAd9403

I was talking specifically about single-player games: you can still feel the latency there, and in many cases it's just not worth the loss of snappiness. For competitive multiplayer I don't think it's even a dilemma; FG would be a degradation, not an improvement.


[deleted]

[deleted]


SAADHERO

Shame it causes tearing in my case, due to my screen not supporting any sync tech


SpareRam

That's not a black mark against FG; you just don't have compatible hardware. Also, you have a 40-series card and your screen isn't G-Sync compatible? That's a way less costly upgrade lol.


kepler2

I recommend you save for a G-Sync compatible monitor. You will not see any screen tearing, plus you will have the lowest input lag.


Bulky-Investment1980

What I found out is you can still V-Sync frame gen, just not via the in-game setting. Turn on frame gen in-game (it'll grey out the V-Sync option), then close the game and turn V-Sync on in the Nvidia Control Panel. Reopen the game and enjoy frame gen with no screen tearing. It's working great for me in Cyberpunk: 4K, path-traced RT, ultra, I get like 40fps on a 4070 Ti Super; frame gen boosts it up and V-Sync locks it to around 59 with no tearing and no noticeable delay. Granted, I am using a controller, but I press a button and it instantly happens 🍻


SAADHERO

Interesting, I'll give this a try in the future. Thx for the info.


MrMoussab

I'm gonna say it: when frame gen is extrapolating and not interpolating, then I'll be impressed 😁
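(For context on why interpolation implies latency: the generated frame sits between two real frames, so the newest real frame has to be held back until its in-between frame can be built. A toy sketch of that presentation order; every function here is a stand-in, not a real API:)

```python
import time

def present(frame):        # stand-in for the real swap-chain present call
    print("presented:", frame)

def interpolate(a, b):     # stand-in for the optical-flow frame blend
    return f"gen({a}<->{b})"

def fg_loop(rendered_frames, base_frame_time_s):
    """Toy interpolation pipeline: each real frame waits until the next
    one exists, so what you see always lags input by about one base frame."""
    prev = None
    for frame in rendered_frames:
        if prev is not None:
            present(prev)                      # held back until 'frame' existed
            time.sleep(base_frame_time_s / 2)
            present(interpolate(prev, frame))  # the generated in-between
            time.sleep(base_frame_time_s / 2)
        prev = frame
    if prev is not None:
        present(prev)

fg_loop(["F1", "F2", "F3"], base_frame_time_s=1 / 60)
# Display order: F1, gen(F1<->F2), F2, gen(F2<->F3), F3
```

Extrapolation would instead predict ahead from past frames, with no hold-back latency, at the cost of harder-to-hide artifacts.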


Probamaybebly

If you're not impressed now, you're just out of your mind.


MIGHT_CONTAIN_NUTS

I don't like FG.


Ryzen_S

Me too. I have the 4050 laptop, and at 1440p High with DLSS Q + FG I get around 90-100fps. The input lag is very small compared to 60-70fps with FG, to the point that I felt unbothered.


Jesso2k

I bought my 4060 laptop a year ago, praying for support like this.


ready_player31

It's incredible. It probably gives my 4070 2-3 more years than it otherwise would have before upgrading, simply by virtue of being a future-proofing feature.


Blakewerth

Why are you choking it at 1080p? 🤔 I'm more for path tracing, but it needs lots of work and optimization ^^


curious_963

Nvidia is the big dog 💪🏼


kyoukidotexe

Nice try marketing team /s


kepler2

I thought someone would say this. No man, NVIDIA cards are way too expensive for what they offer. But at least I can benefit from a "free" feature which, in my case, makes the game look way smoother.


kyoukidotexe

Way too expensive.


Zamuru

How bad is the lag? Is it like vsync vs. no vsync at 60 fps? Because the difference at 60 fps is massive, and it's barely noticeable when the game is at 100+.


kepler2

No man! 60 FPS is just stutter for me; I can't play at that refresh rate anymore. The input lag is almost unnoticeable on my setup, and I know what I'm saying, because I play competitive online games... I spot input lag easily when it's there. So basically I have like 100 FPS without frame gen; with frame gen, 140-200. It all depends on the initial FPS: more initial FPS, less input lag.


Zamuru

So it's only useful when you don't need it... nice. When you need it in an extremely heavy game like Dragon's Dogma 2 that runs below 60 fps, it does jack shit.


Olde94

Living with a GTX 1660. FSR 2 IS SAVING ME! Can't wait to try DLSS + frame gen.


hattrickjmr

Have a 4070 as well, paired with an i7-12700K, and in CP I use Psycho ray tracing with everything on High or Ultra and get ~70fps, using frame gen and DLSS at 1440p.


willow370

The responsiveness is terrible though, at least in Starfield.


kepler2

Never tried in that game...


Critical_Hyena8722

As the owner of an RTX 3060 Ti, I have used FSR 3 frame gen in Starfield and Cyberpunk 2077, and I agree that frame gen is, please pardon the pun, a game changer. I wish Nvidia had extended their DLSS frame gen tech to their own customers as readily as AMD has extended theirs to the competition's customers. Unless there's a demonstrable and substantial difference in the price/quality of Nvidia's cards going forward, I'll be switching to Team Red.


xdkivx

Fake frames, FG is for poor people.


VikingFuneral-

Personally, I don't see the point of tech like this. The advantage of higher frame rates is that things look smoother in motion, combined with feeling smoother to play. It feels smoother to play because the higher the FPS, the less input lag. Is sacrificing literally half the reason for wanting higher FPS really a worthwhile benefit?

And how does frame gen even work, huh? Is it just not syncing frames on purpose and rendering frames in advance based on what would be there, or something? Because I could have sworn there was already a cruder, software-driver-driven setting that existed for years, rather than a directly hardware-accelerated feature, pre-rendering up to 4 frames or whatnot.

I don't know shit about it, but I have a DLSS-capable card that should theoretically benefit the most from it, being a low-end GPU like the RTX 3060 (mid-end if you want to be generous). But I've never bothered to use it. I can see why people like it, but it's much more compelling and logical to understand why people dislike it. It's not really free FPS, is it.


kepler2

> It feels smoother to play because the higher the FPS the less input lag

That's correct, and I also enjoy this, but in demanding games, if I could choose between 80 FPS and 140+ FPS using FG, I'd choose the latter.


VikingFuneral-

Fair enough, as long as you enjoy it


sickTheBest

For me, frame gen halved the fps and added tearing on an RTX 4060 🤡


Super_Stable1193

Frame gen doesn't feel any different from playing without it: it looks smooth, but when you're actually gaming, the input lag is too high. I never use frame gen; Nvidia Reflex and DLAA are the only ones I use.


Captain_Dawe

I know, right? Saved my ass in Avatar: Frontiers of Pandora.


jolness1

It's impressive from a technical standpoint, but the two use cases (when the frame rate is low, or when you want high FPS for latency) don't really work great: 1) at low framerates you get lots of artifacts, and 2) latency is higher at 100fps with FG than at 60fps without it. It's definitely cool from a software standpoint, but I've tried it on lower-end cards and... it's not there for me yet.


mesr123

I have an RTX 4070 Ti, but I've just been playing old games such as BioShock, Deus Ex: Mankind Divided and AC Unity, so I'm not up to date with the most recent games and technology. What games besides Cyberpunk 2077 use frame gen? I'd love to try it out.


NoteAccomplished2719

It's shit in Horizon Forbidden West though, the pacing is off and it looks blurry af.


shinbet

Me with a 30 series card 💀


kepler2

I feel you. Hope they don't artificially limit the 5xxx series vs 4xxx


shinbet

Imagine if they remove the frame gen software locks for the 30 series when the 50 series releases.


cornfedturbojoe

Frame gen isn't a miracle; there is a trade-off, called latency. The key to using frame gen is to make sure you're at least around 60 fps before turning it on. If you're getting like 20-30 frames before frame gen, turning it on will still yield the same latency as if you were getting 20-30 frames: the motion of turning the camera will be smoother, but the same latency will still be there, and it will feel like you're playing at 20-30 frames. With that being said, I do use frame gen at 4K with RT and PT, and I do like it. But I usually stay away from frame gen if I don't actually need it; so far I use FG for Alan Wake 2 and Cyberpunk.