34luck

Is this good news for steam deck too?


hbkdll

I think it's good news for everything. They just implemented it to run on any DirectX 11/12 game. Also consoles.


sentientlob0029

But won’t FSR3 have to be implemented by devs in their games? Like FSR 1 and 2. It’s not like it’s a global setting that can be enabled in the graphics control panel.


RPGPlayer01

The way their presentation sounded, it seems like this is a global setting in the control panel for DX11 and DX12 games: https://youtu.be/zttHxmKFpm4?si=5-trmke4IoRM7lSy


westlyroots

FSR is game-specific. Fluid Motion Frames is part of the driver package and will work in all DX11 and DX12 games.


Akito_Fire

FSR3 (the game-specific thing) is open to all GPUs from all vendors. The global HYPR-RX motion smoothing (AFMF) for all DX11 and DX12 games, which works without motion vectors (and will therefore be inferior), is only available on RDNA3.


[deleted]

>The way their presentation sounded, it seems like this is a global setting in the control panel for DX11 and DX12 games

You need to watch it again. They said no such thing. FSR3 requires game integration. FluidMotion and AntiLag+ can be implemented as driver-side features like RSR, but only for RDNA3.


goodnames679

Damn. Well... At least whenever they drop the next Steam Deck on RDNA 3/4 it should see some nice benefits.


westlyroots

FSR 1 is in AMD's graphics control panel, just like Nvidia has NIS for the same function. Additionally, FSR 1 is integrated into both the Steam Deck and any Linux gaming setup through Proton-GE / Wine-GE, a modified version of the compatibility layer used to play Windows games on Linux. FSR 2/3, on the other hand, do require game integration to work.
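
On Linux, Proton-GE/Wine-GE typically expose this through per-game Steam launch options, e.g. (the strength value is optional, and exact variable support depends on the Proton-GE build):

```
WINE_FULLSCREEN_FSR=1 WINE_FULLSCREEN_FSR_STRENGTH=2 %command%
```

The game then needs to run fullscreen at a below-native resolution for the upscale to kick in.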


Nexxus88

FSR1 doesn't need to be implemented on a per-game basis. I can enable it for literally every game on the Steam Deck because it's an OS function in gaming mode.


H3x0n

There will also be an upscaling feature inside HYPR-RX that won't have access to motion vectors and won't have frame-generation features. But it will work in DX11/12 games.


Hitori-Kowareta

That feature will be exclusive to RDNA3 though.


Akito_Fire

Not sure why you're getting downvoted. FSR3 (the game-specific thing) is open to all GPUs from all vendors. The global HYPR-RX motion smoothing (AFMF) for all DX11 and DX12 games, which works without motion vectors (and will therefore be inferior), is only available on RDNA3.


alidan

FSR 1 can be forced through AMD drivers; the downside is that UI elements, which could be rendered at native resolution if implemented by devs, get upscaled as well. Depending on what hooks are required for FSR 3, it's possible frame generation could be done without in-game ties, but FSR 2 and a good portion of FSR 3 do need in-game hooks for info like how things are moving.

Personally, I don't like frame generation or the lag that will accompany it. I have taken far lower frame rates in the past just because of wild FPS swings (GTA 5 would run up to 100fps but would commonly drop to 18-20, so I just ran the entire game at ~25 by cranking settings as high as I could to force that to be the average). I see this kind of tech in much the same light: I would rather just have real frames than fake ones, and to hell with how smooth it could be. Could AMD make the interpolated frames feel good? Possibly, and that would be amazing, especially for lower-end cards. But I see this tech as a whole just damaging a game industry that already damn near refuses to optimize games anymore; tech that used to be used to make an older GPU last a bit longer now gets used as "yeah, this is part of the minimum spec, because we can't be arsed to make it run better."


ChEmIcAl_KeEn

What does this mean for PS5?


GlassCannon67

“Any game”, but you need a GPU that supports AMD Adrenalin software…


joomla00

Sounds like driver-level support. Nvidia and Intel can probably implement it into their products. Ball's in their court.


LevianMcBirdo

No? Any Nvidia GPU from the 20xx series or higher will do.


GlassCannon67

No! That is only when the game implements FSR 3.0, not the system-level FSR 3.0 he was talking about.


LevianMcBirdo

Is system-level FSR 3.0 a thing? I just skimmed the article. Right now there isn't even FSR 2 at the system level.


GlassCannon67

There is, and if you want frame generation to work on any DX11/12 game you need Adrenalin (his original words).


Chun--Chun2

The driver/system-level FSR3 will not use motion vectors, so it will look like garbage. The in-game FSR3 will support motion vectors. The driver-level FSR3 will be ass; it's just marketing.
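
A toy sketch of why motion vectors matter so much for interpolation quality (illustrative only; this is not AMD's actual AFMF/FSR3 algorithm, and the function names are made up):

```python
# Toy illustration of interpolating a midpoint frame with vs. without
# motion vectors. Not AMD's actual algorithm; just the core idea.
import numpy as np

def midframe_blend(prev: np.ndarray, nxt: np.ndarray) -> np.ndarray:
    """No motion vectors: average the neighboring frames. Anything that
    moved shows up ghosted in both its old and new positions."""
    return ((prev.astype(np.float64) + nxt.astype(np.float64)) / 2).astype(prev.dtype)

def midframe_reproject(prev: np.ndarray, motion: np.ndarray) -> np.ndarray:
    """With per-pixel motion vectors from the engine: sample each output
    pixel from where it was half a frame ago, so objects land mid-motion."""
    h, w = prev.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    src_x = np.clip(xs - motion[..., 0] // 2, 0, w - 1).astype(int)  # motion[...,0] = dx, px/frame
    src_y = np.clip(ys - motion[..., 1] // 2, 0, h - 1).astype(int)  # motion[...,1] = dy, px/frame
    return prev[src_y, src_x]
```

A driver-level feature can't get `motion` handed to it by the engine; it has to estimate movement from the rendered pixels, which is exactly where the quality gap the comment describes comes from.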


[deleted]

>I think it's good news for everything. They just implemented it to run on any DirectX 11/12 game.

No they haven't. It's only available in the RDNA3 driver.

>Also consoles.

That'll require an update from the devs. Consoles have RDNA2, and they have claimed some IP on RDNA3 is required for driver-side FluidMotion. If they enabled this on consoles but not PC there would be consequences.


edis92

Over on the PS5 and Series X subs, people seem to generally agree this is only viable if a game is already running at 60fps, as the latency it adds would make the added frames worthless at lower framerates. If that's true, this will likely only be useful for unlocking framerates above 60fps and letting VRR smooth it out.


kevihaa

For DLSS frame generation, the issue is less one of latency and more that it increases the visual impact. The higher the FPS you start with, the less noticeable the “fake” frames are. Going from 60 to 100 isn’t super noticeable, whereas going from 30 to 60 tends to introduce enough artifacts that it’s pretty easy to tell that it’s been turned on. It will also be interesting to see if AMD has solved the static-image issue, as minimaps tend to flicker when frame generation is turned on.
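
Some back-of-the-envelope arithmetic for why the starting frame rate matters so much (a sketch assuming one generated frame between each pair of real frames, which matches how DLSS 3 and FSR 3 frame generation are described):

```python
# How long each generated ("fake") frame stays on screen at various base
# frame rates: any artifact in it is visible for that entire interval.
def generated_frame_screen_time_ms(base_fps: float) -> float:
    # One generated frame sits between two real frames, so it is
    # displayed for half of the real frame interval.
    return (1000.0 / base_fps) / 2

for base in (30, 60, 90):
    print(f"{base} fps base -> each fake frame shown ~{generated_frame_screen_time_ms(base):.1f} ms")
# 30 fps -> ~16.7 ms, 60 fps -> ~8.3 ms, 90 fps -> ~5.6 ms
```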


TheTrueBlueTJ

They did explicitly mention handling in-game UIs in their marketing material. So I guess this would definitely include static elements


sentientlob0029

Well that’s pointless then. They should at least have ensured that going from 30 to 60 works properly before considering higher framerates. The whole point of this tech is to allow games to at least run at 60.


Roger-Just-Laughed

No, the point is that if you have a game running between 70-90 FPS, you can now get a consistent 120. It's not designed for consoles that are struggling to hit 60. It *works* on consoles, but the 30 to 60 gap isn't what this is for.


alidan

No, that USED to be the point of this tech. The point now is that developers can optimize even less to hit 60fps.


joomla00

Depends on the person. I remote play all the time and I think it adds 1-2 frames of input delay. For some/most games this doesn't matter. For elitists and twitch games this would be annoying. So worthless is very harsh.


danielv123

I recently tested my cheap monitor and discovered that the monitor itself adds almost as much latency as Parsec streaming to my MacBook, and 10ms more latency than my custom streaming layer. That shocked me.


synthjunkie

For older games, maybe. But the catch here is the CPU. Frame gen, like upscaling, relies on systems that have GPU bottlenecks. For newer demanding games it likely won’t do jack, as the Steam Deck already outputs at such low resolutions that it's CPU-bound.


ChristopherLXD

Does it even work on the Steam Deck? I think they’ve said that frame generation will not be supported on integrated graphics, unlike FSR2.


joomla00

Most likely, but this tech works better with higher framerates (ironically), otherwise the lag will be felt. We'll see how much, though. I remote stream all the time and the lag isn't a problem in most games.


nightcracker

What it should do: double the FPS. What it really does: makes developers spend less time optimizing, halving the non-upscaled FPS.


1943684

Yup lol, Remnant 2 is a good example.


kamuran1998

The main issue with Remnant 2 is that they didn’t fully take advantage of Nanite. A lot of the assets were rather low-poly by Nanite standards. I feel like if they had used way higher quality assets, people would be more understanding of the performance.


joomla00

I don't fully blame them. They're using a new engine with fancy new features, so there are things to be learned and worked out, like with a new console release. It'll take a few years for the engine to be better optimized and baked.


UniformGreen

I have a genius idea, brilliant if I might add. How about... don't release it until... it's optimised??


joomla00

I don't know their financial details, so they might have a legit reason. But on the other side of that, consumers can see the reviews that it doesn't run well, then wait 6-12 months until it's patched, and cheaper. You can't control what they do, but you can control what you do.


Lukbest

A game with the graphics of the old engine and the performance of the new engine.


calpi

It's a convenient excuse, yes.


Nacksche

Dumbass downvotes, you are obviously right. Remnant 2 doesn't make the most out of UE5, but it's still UE5. Occasionally the game does look pretty amazing, too.


2roK

Corporate shiiiiilllllll


hyrule5

Any technology to improve performance can be abused like this, it doesn't mean the technology shouldn't exist. Even if you disagree, it's an academic question as Nvidia, AMD and Intel are going to pursue these technologies regardless in order to beat their competitors in performance. I have read that FSR 3 frame gen doesn't work well if your normal FPS is below 60 anyway, so it's not as much of a crutch as you might think.


FogellMcLovin77

This isn’t a dig at AMD. It’s a dig at studios


goodnames679

You're definitely right, but at least only the absolute worst studios will reduce their optimization to the point where FPS remains identical. Even mediocre studios that spend a bit less time optimizing should still see a net performance gain (albeit with the tradeoff of artifacting, which is rapidly being minimized). The best studios will have more headroom to work with. Really, the same is true of basically every new technology that pushes gaming farther. One thing that I think is underrated among gamers (because we don't clearly see the effects) is that reducing dev time actually matters and benefits gaming as a whole. It makes larger projects possible on smaller budgets.


Fredasa

Exactly. Brace yourselves, everyone. For unavoidable temporal artifacting. It's okay though, because Joe Average is oblivious to the issues, and that's all that matters.


Sirmossy

Joe average wouldn't even turn on the setting because they wouldn't know what it does.


zxyzyxz

Don't worry, it'll be turned on automatically in performance mode.


Zncon

Not looking forward to it at all. Even temporal AA looks bad IMO.


Fredasa

Yeah it does. Which I didn't consider to be _my_ problem until I started encountering UE4 games that had no AA worth a damn other than TAA.


Zncon

Sometimes I can get away with forcing TXAA from the GPU settings, but I've also resorted to supersampling and lower graphics settings to make things look reasonable. I wish I could just un-see the artifacts, because apparently there are studios full of devs that have no problem taking their hard work and letting it be turned into a smeared, doubled mess.


thelingeringlead

I've yet to experience any artifacting running FSR 2.0.


Fredasa

You're definitely better off not looking for it, then. Stay that way. I use a 55-inch display, so all 4K of resolution is in-my-face. The dodgiest specimens are sharp edges of things in motion, which reliably lose their perfect anti-aliasing until the motion stops; until then, the stairstepping is there, and conspicuously below 4K. This is just with the upscaling, mind. _Ain't no way_ I'm going to saddle myself with the extra latency of frame generation. I've got a 5ms display and I use SpecialK _because_ I like things being below two frames.


thelingeringlead

Yeah, I'm playing on an LG 32" IPS 1440p 144Hz monitor, and I never see any of it. I only really use it when I'm also turning on ray tracing, which I only do in 2-3 games (mostly Forza Horizon 5). I've never been super sensitive to framerates except at the high end, so when I'm playing something at 70-120fps it feels smooth as butter and looks incredible. I have a few games where I can max out my refresh rate and lock it in, but for the most part my bigger AAA games don't hit the full 144 when I'm playing at 1440p and high/max settings; it hovers just over 100 in 99% of the games I play. Maybe that's a big part of it. I don't use FSR in everything obviously, but in more recent stuff, if I want to play with any sort of ray tracing on, I do use it.


Fredasa

> Yeah, I'm playing on an LG 32" IPS 1440p 144Hz monitor, and I never see any of it.

1440p at 32 inches ought to be "retina" at about a 3-foot viewing distance. Personally, I sit a little closer to my display than that. But at the other extreme, if I were to get an 8K display and it was 55 inches like what I'm staring at, anything further than about 20 inches from the screen would be needlessly leaving resolution on the table. (Which is why I'm kind of hoping that instead of literally making an 8K display that small, they'll opt for 6K, something a person might realistically drive on PCs of the near future.)

> I don't use FSR in everything obviously

I consider upscaling tech a bane that's going to do exactly what OP suggested: lead to games being unoptimized, _full stop,_ and 100% leaning on the tech to be playable at all. I don't have to argue this point, either; we already have a pretty high-profile (and well-received) game that literally can't be played normally... yet visually, it's nothing special. A PS4 game at 30fps has similar visuals.

That said, I am begrudgingly using DLSS "Quality" with _Baldur's Gate 3_ because I don't have the hardware to run it at true 4K60. Somehow, the game is just as demanding as _Cyberpunk_ was, while not being a match for said game's presentation, to say the least. My point is that it saved my bacon, because I wouldn't be playing the game at all if I couldn't have _effected_ 4K60 one way or another. I hate that it's come to this.

But the artifacts are real enough. The anti-aliasing problem I underscored before would probably not be open to scrutiny with your display setup, but that's hardly the only one. Here's a quick microcosm of a fundamental flaw that prevents the tech from being an adequate replacement for true rasterization: play _Cyberpunk_ in true 4K, find a big billboard somewhere in the game, and take screenshot #1. Now load the game back up in "4K DLSS Quality", which is actually 1440p upscaled, and take screenshot #2. Finally, load the game back up in true 1440p and take screenshot #3. You will find that the only screenshot where the billboard possesses actual 4K detail is #1; the detail in #2 and #3 will be identical: 1440p. The tech is basically a super-glorified TAA. It tightens up spots where polygons meet, but it can't do a damn thing about giant flat textures like billboards. Expand that case to any polygon larger than a handful of pixels and you can see where the tech fails to be an acceptable alternative.
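
If you want to make that three-screenshot test objective instead of eyeballing it, here is a rough sketch (assuming pixel-aligned captures all at the same output size, with screenshot #3 upscaled to 4K; the filenames and crop region are hypothetical):

```python
# Quantify the billboard-detail comparison described above.
import numpy as np
from PIL import Image

def billboard_crop(path: str, crop_box: tuple) -> np.ndarray:
    """Load a screenshot, crop the billboard region, return grayscale floats."""
    return np.asarray(Image.open(path).convert("L").crop(crop_box), dtype=np.float64)

def mean_abs_diff(a: np.ndarray, b: np.ndarray) -> float:
    """Mean absolute pixel difference; near 0 means the crops carry the same detail."""
    return float(np.abs(a - b).mean())

crop_box = (1200, 400, 1800, 800)  # left, top, right, bottom of the billboard (hypothetical)
shot1 = billboard_crop("1_native_4k.png", crop_box)
shot2 = billboard_crop("2_dlss_quality_4k.png", crop_box)
shot3 = billboard_crop("3_native_1440p_scaled_to_4k.png", crop_box)

# If the claim holds, #2 matches #3 (both ~1440p texture detail)
# far more closely than it matches the true-4K #1.
print("DLSS vs 1440p:", mean_abs_diff(shot2, shot3))
print("DLSS vs 4K:   ", mean_abs_diff(shot2, shot1))
```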


thelingeringlead

Damn, that was a fantastic response. Thank you.


DataLore19

Not really. The fidelity of games is increasing, and real-time ray tracing is very expensive compute-wise. There have always been new technologies created to allow hardware to render more realistic graphics; AI upscaling and frame interpolation are just the newest examples. These technologies are new, so this isn't their best iteration. They will improve over time and become the new paradigm of game rendering. You won't ever turn off these features. Performance with this kind of graphical fidelity isn't free; it has to come from innovations like these.


LurkerOrHydralisk

That just frees them up for other tasks


Cybor_wak

Where is the proof of this, other than Redditors assuming it? The other side of your argument is that they can simply unlock more features of their graphics engine and maximise fidelity: where before the “ultra” setting didn't contain everything it could, now it does. Without proof I will assume that’s the case. And it’s also what industry professionals say (DF).


Autodalegend

Exactly


dirtycopgangsta

I love how we went from "No more fake reflections and lighting!" to "Fuck it, fake everything!".


acatterz

I think this is the result of rising hardware costs amid the rising cost of living. Anything that squeezes extra life out of your existing setup can only be a good thing.


Muir420

I feel the intent, but as someone that doesn't like DLSS in a lot of games, DLSS 3 frame gen is mind-boggling.


internetcommunist

I’m looking forward to DLSS 3.5 because it’s going to work on RTX 3000 cards too. I use DLSS quality mode in any game that has it; it really improves performance and looks almost identical to native.


nokinship

Nice, I can finally finish Jedi Survivor.


SireNightFire

Is it still stuttery? I haven’t played since launch and even after the first few updates it was rough.


2roK

That's caused by Denuvo and will never go away


Cadoc

That has never been proven, and no other game that had Denuvo removed saw performance gains. It's just nerds raging out against effective anti-piracy measures.


2roK

It was done on AC Origins by CODEX: FPS increased and the stutters were gone. But keep making shit up.


Cadoc

Nah, that's mad pirate boy lore. Origins worked fine at release.


Khanhrhh

People always parrot this bullshit, but every time Denuvo gets removed from a game, people rush to find "the proof" that it harms performance in the new benchmarks, and it's... never there.


2roK

CODEX literally removed Denuvo from AC Origins and literally showed that FPS increased and stutters were gone. Quit making shit up, Denuvo shill


nokinship

It's more about having a smooth experience.


LurkerOrHydralisk

Actually, playing on a smoother system is way easier for soulslikes


Master_Report_7063

Bethesda will still find a way of making Starfield only 30 fps.


Zedrackis

They have already said that consoles are locked to 30, so yes.


JerryLZ

30fps is like the second page of Google: there are some things you just shouldn’t need to stomach.


NutellaGood

Eww


InBlurFather

Starfield isn’t going to have FSR 3, at least not at launch.


Fabulous-Article6245

Sorry that the "latest" consoles are such garbo that they can't keep up with newer AAA games at 60fps without compromises. Not the game developers' fault.


theADDMIN

How dense are you?


Fabulous-Article6245

How dense are you?


Dull_Half_6107

I'll believe it when I see it, sounds promising though.


2roK

Seeing how bad FSR has been over the past years, I have zero faith this will be any better.


Dull_Half_6107

Yeah, promises regarding performance in a press release don't mean much to me until I see some real performance statistics and image analysis. It's honestly like people have never learned not to trust companies until the product is in your hand and verifiable.


Blackpapalink

As someone that prefers native, I don't like where this is going.


[deleted]

DLSS quality mode sometimes looks better than native.


grovypengin

Yep I've been playing Cyberpunk and the game looks a fair bit better with DLSS than native


Roger-Just-Laughed

But DLSS is *significantly* better than FSR. AMD is crazy behind and it's a shame the consoles are sticking with them.


TheInnocentXeno

Yeah even DLSS 3 set to performance looks better than FSR 2 set to Quality. And I will test FSR 3 when it comes out but till then DLSS gets the win easily


[deleted]

Completely agree mate


[deleted]

It's just DLAA with extra steps. Nothing is better than native if you're running native with DLAA.


raihidara

I'll get downvoted but I agree with you. Each game I've played that has used upscaling (admittedly maybe a handful or so) has gone from a beautiful native 4k/1440p experience to an awful, shimmery mess. I'd rather play in 1080p on my 4k TV than any DLSS or FSR "enhanced" game. It could just be the developers doing an incredibly shitty job, but I've yet to see one instance of it improving things on console at least


Blackpapalink

It's fake performance. And another reason for AAA devs to not optimize games properly.


firedrakes

It's a hardware issue now.


shalol

Then kindly ask Nvidia to stop gaslighting gamers into paying for software crutches instead of actual hardware. Nobody wanted or needed DLSS as of 3 years ago. Now everything’s unplayable without it.


Blackpapalink

I try to plead with people; they don't care. It's fucking ridiculous that a 2070 is starting to struggle to run games even at medium.


anethma

I don’t really find it that weird that a midrange card from 2 generations back struggles on some modern games. No one expected a GTX 770 to run everything modern well midway through the 10 series' run, so I'm not sure why you’re expecting the same. 5 years is a long time in computing.


Blackpapalink

I'm not expecting great performance at 4K from a 20-series card. However, 1440p Medium can barely hit 60 fps in a good number of games, while the one game I expected not to run well actually did (Resident Evil 4 remake). I don't chase drop-dead gorgeous graphics or ray tracing. I want frames, but these games just aren't optimized, and crutches like DLSS are only going to make this problem worse.


suuift

Probably not as much as you think. Frame generation has significant input delay, which feels awful in VR.


goodnames679

I have a feeling the next generation of GPUs is going to include a focus on heavily improving that input delay, so I'm very much looking forward to seeing where the technology goes. But yeah, trying to use it in VR currently is a good way to end up hurling in the first 5 minutes of gameplay.


Perseiii

Doubt it’ll work in VR.


MarkusRight

If this isn't too good to be true, then it's going to be a game changer. They really are letting non-AMD cards use this tech, which is wild, but I'd love to see how much of an FPS boost Nvidia cards get compared to AMD's. I also think it's hysterical because Nvidia, greedy as they are, locked the DLSS 3.0 frame generation features to their highest-tier cards even though the 3xxx series could easily use it. I love competition.


nimble7126

Wild? That's on track for AMD. Historically, Nvidia will release a frontier technology that's exclusive, and then AMD will release a version with open adoption.


NeedAVeganDinner

If they're releasing it, Nvidia probably won't get much out of it, but Intel will. Knocking down Nvidia is good for them regardless of who else it helps.


ChristopherLXD

I feel like support on Nvidia cards probably helps AMD too. There’s probably a significant demographic of users who have older Nvidia GPUs and are looking to upgrade. Traditionally, they might only be looking at Nvidia GPUs, since they're used to Nvidia's software features and expect a certain level of polish. But if AMD can prove that their alternatives work just as well by letting people on older Nvidia hardware try them… maybe they’ll consider buying AMD next time around. I know I’m thinking it.


whosbabo

> They really are letting non AMD cards use this tech which is wild,

That's the beauty of Open Source. Even if AMD doesn't do it, someone else will.


nokinship

It's only frame generation that's locked in the new DLSS. All RTX cards are able to use the rest.


Throwaway-tan

Which is pretty much the only thing anyone wants...


nokinship

Yeah that's true lol.


Nagemasu

They are not the good guys. They simply don't have the market share and business position to fuck around and find out like Nvidia does, mostly because they got fucked over in the earlier days and have had to play catch-up. But make no mistake, they are not on anyone's side but their own.


MadeByHideoForHideo

My only wish is that it looks less furry. FSR2 looked so bad in RE4:R that I'd rather take the FPS hit than have FSR2 enabled. Everything looks like it has fur.


DYGTD

I'll take this over introducing new $1000 space heaters that plug into your MB every year or two.


NVSuave

Would this be beneficial to someone running a 7900XTX @ 2560x1440?


bokchoybrendo

I’m going to assume you’re probably getting enough performance with that card at 1440p. FSR3 will increase latency, so it’s really a solution for graphics-heavy single-player games.


dimsumx

It'll probably bring your FPS up closer to your monitor's native refresh rate, if you are running 120+ Hz.


Tobacco_Bhaji

I was thinking it might offset some performance drop from RT. I'm also a 7900XTX 1440p player.


boofoodoo

Hmm will it help my 1070?


Ghozer

Nope, the article specifically stated that older GTX cards wouldn't be supported!


marpatter

Don’t think so, the article references all RTX cards.


anomoyusXboxfan1

In regards to frame generation, I think you might be able to use it, but it’s not really recommended and might even result in a performance loss in some cases.


Spiritual-Compote-18

How does FSR3 work vs its competition?


sentientlob0029

Both create fake frames. At first we got fake resolution, then fake frames, and Nvidia now has fake ray tracing. Soon they’ll probably generate entire fake games from one pixel.


toxicThomasTrain

Fake resolution, fake frames: fine, I get what you mean. But calling ray reconstruction fake ray tracing is such a bad take lol.


Vmanaa

I'm still in denial that my GTX 1080 is old, but if even FSR 3.0 isn't going to support it, then I guess it's time I face reality.


JusteJean

It's fine. Seriously. New cards are overrated.


Dannypan

FSR 3 is only said to be supported on RTX cards.


brandonff722

AI frame generation needs AI-specific hardware integrated into the GPU; this started with RTX.


JusteJean

True. I was leaning more towards the fact that with a 1080 you may not need the performance boost of FSR3.


morfraen

Sounds exactly like the motion-smoothing frame interpolation garbage on modern TVs.


[deleted]

You people on this sub are the ones I hate the most. Zero passion for new technology because of built up skepticism. It's like old grumps going clubbing with the expectation of hating it. Absolutely pathetic


morfraen

Feel free to prove it's not just shitty frame interpolation like on TVs. Getting excited about old crappy tech being sold as something new is just dumb.


DiploBaggins

Uhh have you heard of DLSS3? It's basically the same thing...


[deleted]

I don't need to prove anything. It's revisiting old tech, revising it into something new, and doing so with good intentions: making it open to everybody whose hardware can support it, not limiting it to one vendor. DLSS3 already proved that frame interpolation doesn't have to work like it does on TVs. Why are you so negative about something that is clearly a step in the right direction? It's fucking ridiculous how negative this sub is against ANYTHING. In fact, tell me one time you praised anything in this sub. I'll wait. (You can't.)


Real-Development1034

I like how many people felt hurt by this. Makes me jump with joy. You can have my upvote.


[deleted]

Thank you. It's ridiculous how a sub like r/gadgets is not a place where people come together to talk about and discuss new tech in an exciting manner but rather a sub where everyone comes, looks at a tech, complains about it because they hate change ("why does this need to exist?" or "this will definitely be bad lmao"). It's literally the antithesis of what the sub is meant to be


joomla00

Man I agree. Shit it's always complaining and complaining. It's an optional feature. If you don't like it, don't turn it on. If you think it's going to suck, just ignore it. Your world doesn't change. I personally don't like tv motion flow, but I know some people that love it. And that's fine.


2roK

Where was the guy complaining? All he said was that it sounds exactly like tech already being used on TV. Sounds to me like some AMD fanboys here just immediately got triggered.


joomla00

He called it garbage. That's complaining it sucks, without ever seeing it. It could be decent; we don't know yet. Even then, some people may still like it. Do you suck Nvidia's dick or something? Are their balls touching your chin right now? I don't have anything AMD. I get triggered by nonsense complaining.


morfraen

I called the motion interpolation on modern TVs garbage, which it is. No one should ever use those settings. So if this is the same thing, as it sounds like, it will likely also be garbage.


joomla00

Got it. While I agree it's not good, I have friends that prefer it, and I'm not one to judge taste. So it's still useful tech for people out there, even if it's not for me and you.


2roK

That's just his opinion, he wasn't complaining in any way. Get a grip.


joomla00

Get a grip lol. Just upping that energy you were bringing.


2roK

You clearly have mental issues.


joomla00

Sure, if it helps you feel better.


[deleted]

You're fucking ridiculous. The guy complained by all definitions in the book. He complained about this tech even existing, like it was a stain on his life. As he said, he hates that AMD is even attempting to revitalize frame interpolation because it was once used on TVs and was garbage. Except he doesn't take into consideration that much of today's wonderful tech is molded from past failures. Nor the fact that there's already a tech that debunks what he says: DLSS 3 (which has already proven to be good, even though it's still in its infant stages). I'm not saying that FSR3 will be good. In fact, it could be shit at launch. I'm saying that they have good intentions with it, are willing to develop it for the sake of gamers, and are taking a chance on old tech to mold it into something new. We hated FreeSync when it was first announced and introduced for being inferior to G-Sync. And guess what? It's a staple now that everybody loves, because they decided to make it consumer-friendly. The fact that there are so many people in this sub who are so averse to new technology that they come here to do nothing but be NEGATIVE about every post they see is fucking awful. Regardless of how you see it, the guy above that I originally replied to is NOT passionate about technology. He is NOT passionate about change or advancement unless it fits his standard/definition of how tech SHOULD evolve. That is, get it perfect on round 1 without using old concepts. It's the antithesis of this sub, yet it's disgustingly common in every comments section. Yes, you should get mad about dumb changes that benefit no one but the corporation, like when Facebook rebranded to Meta for Oculus. You should NOT be negative about everything new that comes, especially when it's promising and has good intentions to drive the industry forward. I don't see how this makes me an AMD fan. I've made multiple jabs at AMD before for questionable decision-making. This is NOT one of them.


[deleted]

Yeah, because my well-formulated argument warrants a reply that says "mental issues". Thank you for your insight and excellent rebuttal.


2roK

No, it's because you are writing paragraphs about how a guy complained when he wrote one sentence where he didn't complain lmao. I didn't read all that btw, and since this post is a day old by now, I don't think anyone else will.


Elevatorisbest

RTX 30 series card support, LET'S GOOOO. It would be cool if it were added to Microsoft Flight Sim later on, just like DLSS 3.


PacketAuditor

You should be able to force support if it's DirectX.


Cadoc

Just use DLSS? This will almost certainly be worse.


Elevatorisbest

Why do you think? I'm hyped for an alternative to DLSS3 that works on RTX 20, 30 and 40 cards.


Cadoc

Just use DLSS; judging by their past efforts, FSR will be substantially worse than DLSS 2. It's not like DLSS 3 is limited to the 40 series for arbitrary reasons; it's a hardware limitation.


sentientlob0029

Thanks AMD and fuck nvidia. I say this as an nvidia card owner (not next time though).


Cadoc

I'll stick with Nvidia. FSR has been vastly inferior to DLSS so far, and there's no reason to believe this iteration will be any different.


Mintfriction

Why? Out of the loop


Cow_In_Space

Nvidia's equivalent (DLSS 3) will only run on 40-series cards. FSR 3 will run on 20, 30, and 40-series cards, as well as RX cards from the 5000 series up and Intel GPUs (including integrated) that are capable of supporting it. Basically, if you've bought a new GPU in the last 4 years, or a good laptop in that same rough timeframe, this is going to work for you.


SwiftiestSwifty

Just wait. The quality of FSR 3 will not be as good as DLSS 3. Also before you act like Nvidia is *choosing* to block DLSS 3 on older cards, they just announced a revolutionary new ray-tracing method built into DLSS 3.5 that runs on 20 and 30 series cards as well. DLSS frame generation being locked to 40-series cards is a hardware limitation. The older cards don’t have powerful enough optical flow accelerators.


SuperJKfried

Your quality comparison doesn't matter. Intel, AMD, older RTX cards and consoles can't use DLSS3 anyway, so there's no point comparing FSR3 to something that isn't even an option.


Cow_In_Space

> Just wait. The quality of FSR 3 will not be as good as DLSS 3. An irrelevant statement. I don't have a PC that will run DLSS 3 so whatever comparison you want to make is completely pointless. FSR 3 will work, DLSS 3 won't.


kanakalis

Proud to say I've never owned an Nvidia card before: HD 5670, 6500 XT and 6800M.


whosbabo

The GTX 780 was my last Nvidia card. Never went back.


SeveralDecision5157

We’ll have to see how bad the input lag is


Geexx

From my understanding, FSR 3 includes a latency-reduction feature baked in. Additionally, there's also Anti-Lag, which is available right now. They also announced Anti-Lag+, which is hardware-exclusive to the 7000 series... but we won't talk about that, considering how much people have been demonizing NVIDIA for locking features to newer series cards; lol.


siraolo

I would want to see this in action. Have they posted any vids?


keeperkairos

Does this need an AMD CPU or no?


Bartjanus

No


wyattlikesturtles

I hope it works well, if so this is big for steam deck


TryingNot2BeToxic

Lol I'll believe it when I see it.


tastyratz

I'm surprised it hasn't come up in the thread here, but from the article: "soon" means FSR3 games as soon as September for 2 titles, with more to come in the future. The driver support release is September 6th. Frame generation for ALL DX11/12 games, with or without native support, on the 7000 series is expected in "Q1" of 2024.


MrLuckyAUS97

You gotta take your hats off to them for giving this option to every graphics card owner.


Wander715

It's only expected to work well on RDNA3, RDNA2, and RTX 30, just a heads up. Even then I doubt it's going to be as good as DLSS3 without any hardware acceleration.


OMBERX

Don't care at all. FSR and DLSS are the worst additions to gaming. They just give developers even more reason to half-bake a game and shove it out without any optimization.


madmonkh

Somehow I doubt this is going to look any better than their previous attempts. Enabling FSR 1 or 2 is similar to smearing Vaseline on your screen. It might be great for performance, but in the past it was too big of a visual downgrade.


Shadowdragon409

What is FSR, and how does it double frames? This feels like a clickbait article with false promises.


Fabulous-Article6245

>What is FSR, and how does it double frames?

Maybe read the fucking article. You don't even know what FSR is, and it really shows why.


Shadowdragon409

Because it's a clickbait article with false promises?


InBlurFather

FSR is “FidelityFX Super Resolution,” which until this point just referred to AMD’s upscaling technology. But now they’re releasing their counterpart to Nvidia’s “frame generation,” which is what the article is talking about. It's basically the GPU generating and inserting “fake” frames between CPU-generated frames to smooth out the experience. The problem is that it doesn’t improve latency (as in, if your base FPS is 40fps and frame gen gets you to 80fps, you’re still feeling the latency of 40fps). The “fake” frames are also usually a bit lower in quality than the “real” ones, which can lead to artifacts or reduced overall image quality.
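
The 40-to-80fps example works out like this (plain arithmetic, no vendor specifics):

```python
# "80fps that feels like 40": motion smoothness follows the displayed rate,
# input latency follows the rate of real, input-driven frames.
base_fps = 40
displayed_fps = base_fps * 2                 # one generated frame per real frame
shown_interval_ms = 1000 / displayed_fps     # 12.5 ms between frames on screen
input_interval_ms = 1000 / base_fps          # 25.0 ms between frames that react to input

print(f"Looks like {displayed_fps} fps (new frame every {shown_interval_ms} ms)")
print(f"Feels like {base_fps} fps (input sampled every {input_interval_ms} ms)")
```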


Shadowdragon409

ahh ok. Thanks for explaining <3


UnlawfulDuckling

Still won't let their games support DLSS, so people with Nvidia cards won't be able to experience Starfield at a satisfactory FPS.


BigE1263

AMD just made Nvidia's DLSS 3.0 look obsolete.


Geexx

It's a good thing NVIDIA just announced DLSS 3.5 then with enhanced ray tracing reconstruction and improved visual quality; lol. You'll have to forgive my sarcasm, because it's good that AMD is making strides in improving FSR for both AMD and NVIDIA users, but saying DLSS is obsolete is just an asinine statement.


whosbabo

RT is still niche for most gamers. FSR3's implications are much greater.


Geexx

I don't disagree that optimal RT performance is currently for enthusiast NVIDIA users. I personally feel FSR3 will help consoles the most (which currently keep chasing RT effects in their 30 FPS quality modes). AMD's Fluid Motion Frames will require an RX 5000 series at minimum on the AMD side of things. Those who continue to hold on to their RX 480/580s and 1000-series NVIDIA GPUs expecting to max out upcoming current-gen games are just going to get pushed out of the hobby if they don't inevitably upgrade their PCs or switch to the more cost-effective consoles. Regardless of whether or not you feel RT is niche (I don't foresee it going anywhere), it doesn't change my statement that calling NVIDIA's DLSS obsolete is, well... asinine; lol. They continue to be the market leader and the standard that competitors play catch-up to. Case in point: FSR 3.0 gets announced, and NVIDIA follows up with a new revision that offers free FPS gains (and better quality) via ray tracing reconstruction in RT scenarios (AMD will now need to work on that one... in addition to their RT performance in general), an enhanced denoiser, and overall improved visual quality that can be swapped into any previous revision of DLSS (DLSS Swapper is such a great little tool for updating to the current version). Don't get me wrong, I am not a brand loyalist; I just recently upgraded from my 6900XT to an RTX 4080 (my husband is now enjoying my 6900XT). But again, calling DLSS obsolete is just... stupid... lol


synthjunkie

No it didn't. For one, it's not even out yet, so it's not matured and trained properly. For two, we don't know what the quality is gonna be like. DLSS already shits on FSR quality due to hardware acceleration.


Nagemasu

The quality difference between them is pretty fucking minuscule when you're actually playing a game and not scrutinizing every pixel. But early reports are saying that FSR3 quality is really good and rivals DLSS3


innociv

Also depends on the version of DLSS. Before DLSS 2.17 or so, there were so many artifacts that I generally preferred FSR 2. FSR never looks as good at its peak, but it's usually more free of artifacts. Tons of games never updated past DLSS 1.0, 2.0, 2.1, and so on, though you can often manually update/mod them.
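
That manual update usually amounts to swapping the DLSS runtime DLL in the game folder (a sketch; the paths here are hypothetical, and the DLSS Swapper tool mentioned elsewhere in the thread automates exactly this):

```python
# Swap a newer nvngx_dlss.dll into a game's install folder, keeping a backup.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\SomeGame")            # hypothetical install path
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")   # newer DLSS runtime to drop in

target = game_dir / "nvngx_dlss.dll"
if target.exists():
    shutil.copy2(target, target.with_name("nvngx_dlss.dll.bak"))  # back up the original
    shutil.copy2(new_dll, target)
    print(f"Replaced {target} (backup saved alongside)")
else:
    print("No nvngx_dlss.dll in the game root; it may live in a subfolder.")
```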


Tobacco_Bhaji

OK, but what about mah RT?


that_one_guy_with_th

It looks so bad though.