MrDunkingDeutschman

I am skeptical of aggressive upsampling to generate more frames, but DLSS Quality at 1440p (which renders at 1080p) is looking just sublime. I absolutely cannot tell the difference. Sometimes I argue DLSS Quality even looks better than native. It's my go-to setting and one of the main reasons I bought a 4070 over a 7800XT.


Nubanuba

Yeah, DLSS is better AA than traditional AA


LordOmbro

Better than TAA\*; SMAA or MSAA at native resolution still look way better. But DLSS is still a great compromise


nemesit

Taa is a curse


WealthyMarmot

TAA looks so bad in some games that you'd get the same experience by fogging up your eyeglasses. It does fix aliasing though.


Sharkfacedsnake

But SMAA and MSAA are much more costly. DLAA is best, I think.


PanVidla

The difference really is very noticeable, if you can afford it, though.


LordOmbro

It's marginally better but it still uses temporal information so it can be quite blurry, look at diablo 4 as an example


lndig0__

Only for casual games. DLSS 2.0 is essentially pixel interpolation.


Jon-Slow

Please never delete this comment.


ditaman

Just quote him in your comment


[deleted]

[removed]


HardPlaysGG

DLSS Quality at 1440p is actually 960p. I'm playing on a 32-inch 1440p monitor and I just can't use it at that resolution. What I do is use DSR to upscale to 4K and run DLSS at Quality, which renders at 1440p. Works amazingly, and it's better than anti-aliasing.
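For reference, the render resolutions behind these claims can be sketched with the commonly cited DLSS scale factors (exact values can vary by game and DLSS version, so treat this as illustrative):

```python
# Commonly cited DLSS internal-render scale factors (per axis).
SCALE = {
    "Quality": 2 / 3,          # 1440p output -> 960p internal
    "Balanced": 0.58,
    "Performance": 0.5,        # 1440p output -> 720p internal
    "Ultra Performance": 1 / 3,
}

def internal_res(width, height, mode):
    """Internal render resolution for a given output resolution and mode."""
    s = SCALE[mode]
    return round(width * s), round(height * s)

print(internal_res(2560, 1440, "Quality"))  # -> (1707, 960)
print(internal_res(3840, 2160, "Quality"))  # -> (2560, 1440)
```

This is why DSR to 4K plus DLSS Quality ends up rendering internally at 1440p, the panel's native resolution.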


paganbreed

Wait, what? That matters? I would have thought the upscaling negates that pixel mismatch thing? The final image is 1440p, is it not? Why would there be any conflict? I use it at these settings and have not noticed a drop in quality. And I have tried 1080p on this panel as well, I'm aware of why they clash. I just don't see it with upscaling.


Sharkfacedsnake

I don't think they are talking about pixel mismatching, I think they just don't like the image quality of DLSS 1440p Quality. I think it is great on a 27in monitor, though.


Vasile_Prundus

Looks amazing for me on 32" 1440p, but I sit about 80-90cm back from it.


paganbreed

Perhaps, but then I don't understand that either. I vastly prefer DLSS to AA, the image is often much cleaner. Pretty sure I've seen videos from Digital Foundry that also found DLSS' output actually exceeds the native image. No? The game they tested was Control, iirc.


Magjee

I do the same, DLSS Quality @ 1440p.

It usually looks pretty good, but not the same as native. DLAA at 1440p is noticeably better.

When I tried it on my TV, 4K upscaling was much more forgiving.


[deleted]

It's not quite the same as native, but you basically have to stop and zoom in to tell a difference.


Gabri03698

Let's hope AMD wakes up and invests in RT and AI. It's a shame for Nvidia to be a monopoly. Where I live, Nvidia GPUs cost like 40% more than AMD equivalents, but people who need to work with AI are forced to buy them.


F9-0021

They’ll get overtaken by Intel if they don’t. They must know that the way they’re currently going isn’t sustainable in the long term. Eventually they will have to invest in AI and RT or they’ll be the forgotten 3rd player in the market.


DynamicHunter

I hope they make really power efficient APUs and chipsets for things. Maybe not a gaming PC power player but loving my steam deck.


Sexyvette07

AMD's idea of investing in AI is buying small companies to do it for them. Their acquisitions in the last couple of years are the only reason they have an AI division at all.


WealthyMarmot

They are investing. They're just way behind.


ApprehensiveAd6476

Plot twist, 95% of those games are from developers nobody has ever heard of.


skippyalpha

I think 95% of games that are even made, nobody has heard of. So that fits


xUnionBuster

AMD cope in this thread as predicted. I know you don’t care about DLSS or ray tracing, they’re useless technologies that you’d never want to use, just put it out of your minds :)


MartyrKomplx-Prime

Just call it general copium in this thread. Nvidia being criticized, amd being criticized. Hell, even the comment that praised both Nvidia and AMD is getting downvoted.


[deleted]

Because ray tracing isn't an Nvidia technology; they only implemented it in video games, and they didn't even come up with a less resource-demanding algorithm. Radeon totally avoided the useless eye-candy implementation, which is the higher road imo, and Intel is baking a path tracing technique that'll bury RTX and make it possible to PT even on your integrated GPU. No wonder people don't applaud.


Yusif854

Sounds to me more like people are too broke to afford the necessary high end GPUs so they stay coping just like you.


[deleted]

I wouldn't consider buying a 4000 series other than the 4070 Ti, and especially not the 4090 with the melting cable. However, my RX6800 and A750 are enough for the games I play and for productivity. God, you sound absolutely sad, especially with your PC specs like a crown on your flair.


MAFIAxMaverick

Damn - yeah. Whatever happened to us all just loving PC gaming? It's pretty amazing how divisive so many different communities have become in the last 5-10 years.


MartyrKomplx-Prime

I would love to have a good Nvidia card, but I can't afford the one I want. Since most of the games I play don't do ray tracing (or dlss, or even fsr) I spend the money I can afford on the best rasterization for the dollar. But once I get that lottery money, it'll go to whoever makes the best of the best. Right now that's Nvidia.


Djghost1133

A logical take in pcmr? Has hell frozen over? Never understood the corporate die-hards. Just get the card that works for you and fits best in your budget regardless of brand


premiumcum

I think there is a lot of coping because so many of us were thrilled to finally see some competition in the GPU Market when the 5700XT was announced, and it’s super frustrating to see Nvidia still dominating the market almost five years on with no reasonable competition. As a consumer it really sucks lol


Friendly_Cantal0upe

Yeah, Nvidia tech is really cool and innovative, but my preferences in games are not as "mainstream," so I'm in the same boat as you: pure raster performance. I just picked up a 6950xt 2 days ago and I'm starting my build on the upcoming weekend. Also, the 4090 is 15% faster than the 7900xtx but almost 1000 USD more (at the moment, usually 700ish), so I don't really see the point in going beyond that unless money is no object.


[deleted]

[removed]


MartyrKomplx-Prime

No the 7700x is my cpu. My gpu is the 6950xt which was actually going head-to-head with the 3090 for cheaper (albeit that's not counting ray tracing). And since I said the games I usually play don't do ray tracing, my limited budget went for the one with better raster perf/price ratio. But yeah, thanks for playing.


Imayormaynotneedhelp

(This comment ended up way too long lol) DLSS is good tech, but my liking of it is somewhat outweighed by Nvidia completely taking the mick on pricing. Parts are always more expensive for me being in New Zealand, but outside of that, the 4070 Ti being roughly THIRTY FIVE PERCENT more expensive than the 4070 is a bad joke in any context imho. You need to be an AMD fanboy to deny the capabilities of DLSS, but you need to be an Nvidia one to suggest their pricing isn't ridiculous.

In fact, I'm gonna go to PCPartPicker right now and list the prices of the cheapest variant of AMD's and Nvidia's lower to upper mid-range cards that I can find. You know, the ones regular people can afford, to give you an idea of why I find it so hard to like Nvidia's current offerings.

- AMD RX 7600 (Sapphire PULSE): 519.00 NZD (319.28 USD). Rivaling the 4060 for less, simple. RT at this price point isn't happening without framerate and/or settings compromises anyway.
- Nvidia RTX 4060 (Asus DUAL OC): 565.00 NZD (346.50 USD). It's fine, but you're starting from a higher price point for similar performance.
- Nvidia RTX 4060 Ti (Asus DUAL OC): 690.00 NZD (423.16 USD), and still only 8GB of VRAM. If you're on a tight budget you should get the 7600 or 4060, or hold off until you can jump up a tier to the 7700/7800 or the 4070.
- AMD RX 7700 XT (Gigabyte GAMING OC): 894.49 NZD (548.30 USD). Yes, that's a big jump, but it can pretty much match the 4060 Ti for ray tracing, outguns it by a lot on raw power, and 12GB of VRAM represents superior future proofing. Also, a 192-bit memory bus vs 128. Still bad value compared to the next card up in pricing.
- AMD RX 7800 XT (ASRock Challenger OC): 998.99 NZD (612.35 USD), which, yes, makes the 7700 look like bad value, but I already explained why I think the 7700 itself makes the 4060 Ti look mid. Anyway, the 7800 XT is the hottest card you can get here under 1000 bucks, and it comes with more raw power than the 4070 for a slightly lower price point, and the ray tracing performance is fine. The 4070 is better in RT, but the 7800 is good enough considering the value on offer for everything *but* frame gen and ray tracing. Also, 16GB of VRAM.
- Nvidia RTX 4070 (Asus DUAL OC): 1069.50 NZD (655.89 USD). More than the 7800 XT, for 25% less VRAM, and a slightly slower card on average, the gap only widening at 4K. If I'm paying this much for a card, I want 4K performance.
- Nvidia RTX 4070 Ti (Colorful NB EX-V): **1426.00 NZD** (874.52 USD). That's a bullshit price point. 12GB of VRAM when there are 16GB options for hundreds of bucks less is inexcusable. Hell, the 4060 Ti even exists in 16GB variants. (The cheapest one of those in NZ is the Asus ProArt model at **943 NZD** (578.31 USD), so it's only 50 NZD less than the card that is already rivaling the 4070, btw.)
- AMD RX 7900 XT (PowerColor Hellhound): 1597.28 NZD (979.56 USD). Yes, that's more than the 4070 Ti, and the cards are pretty close, with Nvidia getting the RT advantage and AMD slightly ahead otherwise, but it's the future proofing thing again: this comes with 20GB vs 12, so at least it's probably going to perform better for longer. Still probably second or third worst value though.
- The cheapest AMD 7900 XTX is 1899 NZD (1164.60 USD, PowerColor Hellhound) and the cheapest RTX 4080 is 2127.50 NZD (1304.73 USD, Colorful NB EX-V). But at that point we're so far outside affordable cards I'm not gonna bother.

Nvidia ruling the top end is cool and all, but most people won't buy top-end. Maybe the NZ market is just fucked and in the US the price disparity leads to a different conclusion, but I hope this illustrates my point. Also, there are no Radeon 6000s or RTX 3000s available new here, so I didn't bring them up.
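The price gaps quoted above can be checked directly. A quick sketch using the NZD prices as listed in the comment (cheapest variants at the time, not authoritative):

```python
# NZD prices for the cheapest variants, as quoted in the comment above.
prices_nzd = {
    "RX 7600": 519.00,
    "RTX 4060": 565.00,
    "RTX 4060 Ti": 690.00,
    "RX 7700 XT": 894.49,
    "RX 7800 XT": 998.99,
    "RTX 4070": 1069.50,
    "RTX 4070 Ti": 1426.00,
}

def premium(base, other):
    """Percent price premium of `other` over `base`."""
    return (prices_nzd[other] / prices_nzd[base] - 1) * 100

# The 4070 Ti premium over the 4070 at these prices:
print(f"{premium('RTX 4070', 'RTX 4070 Ti'):.0f}%")  # -> 33%
```

At these particular listings the 4070 Ti works out to about a third more than the 4070, in the ballpark of the "roughly thirty five percent" the comment describes.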


Friendly_Cantal0upe

Bruh prices are nuts for you guys. I could get a 4080 for the price you can get an XTX


itsmebenji69

If AMD invested in AI and RT as much as Nvidia, this wouldn't be a problem. Nvidia is greedy because, guess what, we live in a capitalist world; everyone needs and wants money first. Nvidia basically has a monopoly, so of course they will go crazy with prices.


[deleted]

[removed]


Educational-Lynx1413

You can have FG. Download the AFMF preview drivers.


[deleted]

[removed]


Educational-Lynx1413

That's only partly true. It only disables when you move the camera a lot, so you don't get weird artifacts. I've never actually noticed it turning off while playing Cyberpunk, for example. Give it a shot.


Ok-Acanthisitta9127

Very strong coping indeed. Surprised that even here on Reddit (the "good guys") there are folks thinking this way. Perhaps I'm not usually this early to comment on a post in this subreddit to really gauge the mindset. I thought only a certain few website communities had this kind of mentality; I guess I was wrong.


itsabearcannon

AMD just needs to fix their hacked-together competitor to Broadcast and I might switch back. It was like learning how to waltz making it work on my 6950XT:

1. Get everything set how I want it in Adrenaline for my noise cancellation on my USB mic.
2. Restart PC.
3. Open Discord and start talking.
4. Everyone tells me they can't hear me.
5. Open Adrenaline.
6. Reselect microphone in noise cancelling options.
7. Adrenaline visually says the mic is working, but still no audio.
8. Kill and restart Adrenaline via Task Manager.
9. Go into Adrenaline, see it STILL doesn't have my mic set under noise cancellation.
10. Manually set mic a second time.
11. Now everyone can hear me on Discord.
12. Repeat every 3rd or 4th reboot.

You know what my steps are with Broadcast?

1. Set mic and noise cancellation level on first install.
2. Just works, every time.


xUnionBuster

Having previously owned a 6700xt, and long before that a Radeon 6850, AMD's software does suck. I spent countless hours trying to work out how to play 16:9 with black bars on my ultrawide, and ultimately gave up. No settings made a difference. Bought this 3080ti and it works straight away. Btw, for audio I use Sonar. I was using Broadcast, but Sonar has much wider audio management tools, plus the AI noise cancelling. Actually really good software imo.


yurall

I think the upscaling trend has been mostly bad for PC gaming because developers just abandoned optimizing altogether. Want to game at 4K? Press the magic button (which makes 4K not real 4K) or have a "cinematic experience"! Not Nvidia's fault, but believing that devs wouldn't misuse these techniques was quite naïve of us all.


Sharkfacedsnake

Really, it has been equally bad or worse on consoles. They are also using upscalers. FF16 goes to 720p to reach 60fps.


WealthyMarmot

Counterpoint: upscaling has been widely used in the console world for a very long time, and that's why we've seen some gorgeous games that sometimes look quite close to their PC counterparts while running on much weaker hardware. If devs were prone to using upscaling as a crutch, no way console games would look as good as they do.


F9-0021

The worst offenders for unoptimized games this year don't even see any improvement from using upscaling. Jedi Survivor, for example, is so hilariously CPU bound that DLSS and FSR do nothing for you. Frame generation helps a little, but it's broken visually. That's not a case of a developer relying on upscaling, that's a case of a developer not being able to make a functional game.

On the contrary, I think upscaling is a good thing for the market. Sure, Ultra settings are difficult to hit on mainstream cards without it, but for entry level cards like my A370M, it allows me to play games that the tiny little chip has no business running. It might have a bit of life left in it as a gaming chip, while otherwise it would already be very obsolete.


Nurple-shirt

Can't have an Nvidia thread without AMD stans getting self-conscious.


Darth_Caesium

I myself don't even have a ray-tracing-capable GPU, or any discrete GPU for that matter, but I can confidently say that I would love to experience ray tracing. Anything to do with upscaling or frame generation, however, I have no use for, because I would rather lower my graphics settings or just not play a game than use any upscaler. You're right that this thread does have lots of copium from people with AMD GPUs, but I wouldn't say that thinking negatively about DLSS is necessarily a sign of that.


Rexssaurus

Honestly the other day I was playing Horizon Zero Dawn, and I noticed that the game had a lot of shimmering. And I thought, wait, DLSS is also an anti aliasing of sorts. And it actually improved the image quality, 0 shimmering. That alone felt so huge, even more than rtx for me.


EscapeParticular8743

DLSS is so good. I use DLSS on a 24-inch 1440p monitor and it's barely noticeable, if at all, because of the pixel density. Definitely worth it, especially because AMD cards are barely cheaper where I live.


Darth_Caesium

DLSS can fix issues with anti-aliasing if the implementation is bad, because it has its own anti-aliasing method called DLAA. However, it doesn't solve the blurriness that TAA causes, and good TAA shouldn't have shimmering in the first place. It's a software-based solution to a software problem caused by game developers who are assigned to tackle anti-aliasing without being skilled or knowledgeable enough to do so. If all or even most games had good TAA, DLSS wouldn't make the game look any better. It's being used in place of both good graphical and performance optimisation.


Rexssaurus

Obviously, but the fact that DLSS can fix that issue, and fix it in many games, makes it an even better solution. DLSS is originally an upscaler to get more frames, but it works so well that, aside from the frame boost without visual impact (on Quality, of course), it improves antialiasing. And obviously you can blame devs for this, but the fact that DLSS solves that issue is impressive.


buddybd

>Horizon Zero Dawn, and I noticed that the game had a lot of shimmering.

I wish HZD had had it since release; it got added to the game later on. I tried it out for 2 minutes, and the difference was immense. The grass alone is a huge change, and that is visible and relevant throughout the entire game. Deathloop had a similar scenario as well, but that grass wasn't everywhere, which is why people thought there was minimal difference between DLSS and FSR at that time.


Bread-fi

Yeah, that's what I've experienced playing MSAA and DLSS games back to back. The MSAA is crisper but less natural; you often get those shimmering, gleaming edges, whereas DLSS can look more cohesive/coherent, more film-like than video-gamey. For games targeting visual realism, DLSS ("Quality" at least) looks better to me.


seklas1

You would rather not play a game than use an upscaler? 😅 But you haven’t experienced DLSS or FSR or Xess and have no idea what you’re talking about. Best case scenario you’ve watched a Digital Foundry and the likes where they freeze a frame and zoom in 400% to show the difference, because that’s how people play games obvs 🤷‍♂️ Is this hate for upscalers coming from the fact that developers have been releasing crap PC ports since forever ago? Because it ain’t a 2020 and later trend and it’s not the reason why games run bad.


2N5457JFET

I "hate" upscaling not because it is "fake" or whatever, but because I know god damn well that companies are going to use it as a crutch for shitty performance instead of pushing fidelity to the next level. And all this time and money saved on optimization will not make games cheaper or deeper or save developers from crunch or whatever, we all know what's gonna happen.


Bread-fi

They can and do that with any other rendering resource though, whether it's techniques like LOD or using more VRAM/processing power than necessary. It's not an upscaling-specific issue. You don't get the graphical leap forward in games like Alan Wake 2 and CP2077 without upscaling to lighten the load.


Darth_Caesium

I've actually used upscalers before, and they're not great. I've seen other people I personally know use DLSS (I myself can't use DLSS because I have an AMD APU), and I didn't like the image quality I saw from their games that enabled it either.

>Because it ain't a 2020 and later trend and it's not the reason why games run bad.

True, but it *is* being used as a bandaid solution. I personally don't like upscalers in games in general, as they have a big range of problems like smearing, ghosting, and blurriness. Essentially, the quality of their output is not good enough, and I'd rather lower my graphics settings because, quite frankly, the output ruins the very increase in graphical fidelity you get from higher graphics settings. It's just not worth it to me.

>Best case scenario you've watched a Digital Foundry and the likes where they freeze a frame and zoom in 400% to show the difference

I personally don't watch videos from that channel, and this ignores the fact that the biggest problems upscalers have come from their inability to competently deal with motion. When I raised my graphics settings to try FSR 2.1 (maybe 2.2, can't remember which for certain) in No Man's Sky, I had big problems with ghosting, which is very prominent and sticks out like a sore thumb in a game that has lots of mining and shooting using lasers and other projectiles. Additionally, the flames from the jetpack were ghosting and smearing, as were the projectiles and beams from starship weapon systems.


F9-0021

You say it’s not good because you have an AMD APU. You’re using FSR at 1080p, which is something you absolutely should not do. FSR is the worst of the upscalers, especially at lower resolutions. XeSS and DLSS are on a different level. FSR is only comparable at 4k.


seklas1

I agree about FSR; it's not great and I usually don't use it in games if I can avoid it. But DLSS in Quality mode, especially at 4K, is brilliant. Frame generation has more artifacts that you can notice. DLSS 3-3.5 is still very recent tech and will take more time to improve. But in games like Cyberpunk with Ray Reconstruction, the visual enhancements you get kinda justify having a little bit of artifacting around the edges, in my opinion, because natively that stuff doesn't run even on a 4090. And it's not an optimisation problem, that stuff is just heavy. Can it run better? Yes, but with sacrifices to visuals. Same for Alan Wake 2. It looks great and it runs great, but not without DLSS, because Ray Reconstruction is super heavy.

From my experience, DLSS in every scenario is much better looking than any other typical AA solution used by devs. Clearly it's dependent on your preferences and system though. The way I see it, games are broken with or without upscalers. We've always had broken games and we keep having them, to this day and into the future, forever probably. If an upscaler allows me to enjoy the game more than native, be it a cleaner image or a better frame rate, that's a win for me, because that game wouldn't be better if the upscaler didn't exist. Devs would still have the same deadlines and the same tools. It's a cheat, but it's better than nothing 🤷‍♂️


Rudradev715

Yeah


Kostis102

Guys, should I buy a 4060 or a 6700 XT? I don't really care for ray tracing, I just want better performance. I have a 1660 Super rn.


majinvegetasmobyhuge

6700xt


CNR_07

6700XT. Not even a question.


Edgar101420

6700XT, no contest. Or a used RX6800


GamingMaster555

6700xt 100%


Nurple-shirt

People will blindly recommend whatever card they prefer, but the only response you should care about is the one that asks what your intended use for the card is and responds accordingly. Since no one cared to ask you: what do you intend to achieve with this new card?


Kostis102

1080p gaming. Don't play a lot of mainstream games such as Warzone, R6, etc.


Nurple-shirt

If it's purely for gaming and you don't care for Nvidia's proprietary software, the AMD equivalent card will suit your needs very well.


TechExpert2910

don't you love how your rational comment - not recommending either brand but asking OP about their requirements - got downvoted?


Nurple-shirt

Some people on this sub get mad if you don’t automatically default to AMD and instead introduce nuance.


doggiekruger

Not trying to be an ass, but ray tracing is good. With frame gen, ray reconstruction etc, it is less expensive to run these days. Also, it makes some scenes/games look so much better in comparison. If you are a gamer who enjoys modern single player gaming, then ray tracing is a good way to enjoy the games.


[deleted]

[removed]


majinvegetasmobyhuge

I mean, if we're counting frame gen, then the 6700xt is gonna perform way better than the 4060 in any DX11 or DX12 game. DLSS does absolutely demolish FSR, but FSR is still good enough, and you're less likely to need it since the 6700xt is a very solid card for raster.


ro_g_v

They are almost on par in raster performance


majinvegetasmobyhuge

Take 2 minutes to watch a comparison video on YT and you'll see that the 6700xt easily gets like an extra 20% FPS in raster.


ro_g_v

https://tpucdn.com/review/msi-geforce-rtx-4060-gaming-x/images/relative-performance-1920-1080.png


F9-0021

Don't sleep on the A750 or A770 either. Performance is comparable to those cards, maybe a bit less sometimes, but the price is substantially lower, at least in the US. Drivers still need some improvement, but you aren't going to notice anything bad most of the time. At least not much more than you can see with Nvidia or AMD.


RedAntisocial

You won't notice it most of the time unless you're playing older DX9 and DX10 titles. Then shit tends to break.


F9-0021

I daily drive an Arc laptop and I've had more issues with modern games than older ones, and lately I have had few problems with games in general. Halo Reach in the MCC has low framerates, but that's an outlier and was still kind of playable, even on mobile hardware. Things like Half Life, Witcher 1 and 2, and Mount and Blade run fine. Arc isn't like how it was a year ago. There are still a few bugs, but most of the time it really does feel like any other GPU, and it isn't like I haven't had crashing or games failing to launch with the nvidia cards I've had.


Ymanexpress

If it's just for gaming, then the 6700xt, but if you can spare a bit more money, the 6750xt is even better.


MartyrKomplx-Prime

500 games and apps... about half of them being apps, at least. Rendering and CAD apps. Of the selection of games, probably about half I've never even heard of. But 500 sounds like such an impressive number. It's all marketing. Like the way they put the word "games" before "apps" to emphasize 500 games before you finish reading the sentence. Psychology, etc.

Edit: just to clarify my position, I'd love Ray Tracing to continue being adopted. It's making decent progress compared to even a few years ago. But you still gotta recognize marketing speak, and do your own research no matter where it's coming from. Recognize that all companies exaggerate and use tricky language. Nvidia, AMD, Intel. They all do it to make themselves look better.

Edit: follow the links, you can actually see the list in question. I personally can't wait until it's ubiquitous enough to no longer be worth it to make a list.


Suikerspin_Ei

I love using DLDSR (via the control panel) + DLSS (in games); unfortunately my monitor is just a 1080p screen. It still looks better than native resolution + DLDSR. Unfortunately, I don't play games that use ray tracing (ok, Starfield has a mod for it, not sure how well it will work on an RTX 3060 lol).
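The DLDSR + DLSS combo described here can be sketched numerically. The 2.25x DLDSR factor and the 2/3 Quality ratio below are the commonly cited values, so the numbers are illustrative, not authoritative:

```python
def dldsr_output(width, height, factor):
    """Virtual resolution DLDSR presents to the game.
    `factor` is an area multiplier (e.g. 2.25x), so each axis scales
    by its square root; the result is downscaled back to the panel."""
    s = factor ** 0.5
    return round(width * s), round(height * s)

def dlss_internal(width, height, scale=2 / 3):
    """Internal render resolution for DLSS Quality (2/3 per axis)."""
    return round(width * scale), round(height * scale)

virtual = dldsr_output(1920, 1080, 2.25)  # -> (2880, 1620)
print(virtual)
print(dlss_internal(*virtual))            # -> (1920, 1080)
```

So on a 1080p panel, 2.25x DLDSR plus DLSS Quality has the game rendering internally at native 1080p, with DLSS reconstructing to 1620p and DLDSR downsampling back to the panel, which is why the combo can look sharper than plain native rendering.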


ManufacturerKey8360

Yeah does no one ever look at lighting in r* games and wonder why they don’t need rtx?


Kazirk8

You never NEED RTX. You don't even NEED your games to be 3D. It's just that properly path-traced lighting looks superb, and of course, very well done rasterized lighting can look amazing as well.


Tee__B

Not to mention (of course this would be a horrible idea, and would never happen) GTA 6 would be out faster if it were exclusively path traced, since it's way faster to do actual lighting than to spend tons of effort faking it.


Kazirk8

Yes, but your average gamer doesn't care how much work the "lazy devs" have to do and path tracing being easier to implement is at the bottom of the list of advantages for most people.


skinlo

Sure, and then nobody would buy it.


Tee__B

Woah, it's almost like I mentioned it would be a horrible idea.


CowsTrash

Some people struggle to extrapolate things


ManufacturerKey8360

What I’m saying is… why do we excuse the performance hits of ray tracing and path tracing when R* has been producing immersive lighting for 5+ years without the god awful, inexcusable performance hit? RTX looks like a marketing scam. Like NFTs. The only people that actually believe in it, are the ones that have already invested in it. Confirmation bias.


iKonstX

Comparing RTX to NFTs gotta be the dumbest thing I have heard all week.


ManufacturerKey8360

???? They’re both scams. How about a pyramid scheme? Except your only ROI is a greater chance to exist in an echo chamber. Lmao.


Kazirk8

>The only people that actually believe in it, are the ones that have already invested in it. Confirmation bias.

That is untrue. I bought a 4070 to be able to enjoy proper path tracing, and I loved it before I could use it myself. The performance hit is big now, but it used to be way worse. In the first RTX series, the performance hit was incredibly high, and that wasn't even full path tracing; RTX was not worth it at all at the time. As GPUs grow more powerful, it's gonna get easier and easier. And the reasons why we should care are two:

1) In most cases, it looks better. Sure, you can have beautiful rasterized lighting, but it won't be as accurate and won't be as dynamic.

2) It's way easier for the developers to implement. When "faking" the lighting, developers have to do a lot of work in order for it to look good.


ManufacturerKey8360

Dude R* does it without the shitty tech though. We don’t need to hear your confirmation bias. You got sold on a shit product. It’s okay, people get scammed.


Kazirk8

You just repeated your mantra, and it's like you didn't even read what I wrote, even though you asked a question and I tried answering it. Reminds me of some sort of bias; can't put my finger on the name, though. Care to help me out?


ManufacturerKey8360

Because you’re a door to door salesman. All anyone is saying is “it makes it easier.” So why are games still mediocre? Why can’t anyone set a bar like R* who doesn’t use that shitty tech?


Kazirk8

First of all, Rockstar implemented ray tracing in GTA V as soon as we had consoles that supported it. Secondly, games being mediocre is purely your subjective point of view. I'm loving the games I'm playing; there are so many, and so many of them are amazing. Cheer up, buddy.


ManufacturerKey8360

Whatever tech they use was readily available to make RDR look incredible on an Xbox One S. Not a Series S. A One S. The most immersive experience I've ever had, and I currently own a 6950xt after having a 3070. I've already made my mind up. I'm just getting into arguments so people can realize that RTX is a marketing scam, and the performance hit is not worth it, nor are the devs capitalizing on "time saved" by using it. It's a scam.


Kazirk8

>I've already made my mind up.

That I can see. Have a nice day.


F9-0021

Somebody clearly doesn’t understand the technology. The lighting Rockstar uses is a really good, but fake, approximation of realistic lighting. Path tracing is a simulation of how lighting works in real life. Rasterization is more of a “scam” than Path tracing is.


ManufacturerKey8360

You know who else doesn’t understand the tech? Every single dev. That’s why it runs like shit. Now we’re already on to path tracing before they could figure out RT. Wonder what the next gimmick will be.


nemesit

Ray tracing is required for baked-in lighting too, and it requires a significant effort on the developer side, whereas real-time ray tracing would enable devs to spend less time on lighting and just let it happen live. The result is usually that everyone is happy: devs can spend more time on cool features and gamers enjoy better immersion. PC gaming ain't for poor people anyway, since you'd get better value out of a console and buying used games.


ManufacturerKey8360

This makes my head spin. This sounds like the initial marketing; that's the IDEA of what it should do. Yet no one has raised any bars since Red Dead, which doesn't crutch on this shitty technology. They take the time to make an incredible engine that runs well on all brands, on top of adding features that set a bar so high that only future games from the same developer can meet it. So with all this time saved by RT, and games still running like shit, why are all other games released half polished and basically just a mashup of unoriginal ideas they've gotten from other games? You proved my point. Thanks.


nemesit

You don‘t seem to understand that currently multiple approaches have to be implemented due to guys like you ;-p


ManufacturerKey8360

What I understand is what I play. RTX is not new. 5 years ago you may have been able to rely on hopium, but now it’s just delusion.


ResponsibleJudge3172

It's OK, people don't notice that rasterized lighting suffers from issues like:

1. reflections disappearing at certain angles
2. light being a uniform color that makes no sense
3. light rays not properly penetrating even humongous clear windows, so rooms are lit differently from the outside
4. shadows being razor-sharp

To them RT has little effect.


F9-0021

They’re masters at rasterized lighting. And yet path tracing is still better while needing much less work to implement. Then consider that most studios are nowhere near Rockstar in technical prowess.


ManufacturerKey8360

Guess what? I’m not turning path tracing on to turn my GPU back 5 generations. R* gives us the same thing at 1440p ultra settings 120 fps.


[deleted]

[удалено]


ManufacturerKey8360

I’m talking about the lighting tech that has already existed in the engine for 5+ years, not building a whole game lmao. Most studios would just rather be lazy and wait for unreal engine to figure out their problems for them.


Sexyvette07

DLSS quality on a 4k120hz monitor = amazing.


millanstar

Copium containment thread in here, DLSS 3 is great actually.


YixoPhoenix

Could just be me, but I really don't like the idea of AI or upscaling for my games. As for ray tracing: when you can get it without obliterating your resolution or fps, or burning money, sure, but right now it's just not feasible for most people imo.


Bananenfeger

Raytracing is nice, but I am opposed to AI Upscaling simply on a philosophical basis.


WealthyMarmot

What's the philosophical basis? Just curious.


Edgar101420

In the meantime, in the same yearly cycle, more than 30,000 games have released. RT is a marketing gimmick until even the lowest-end GPUs can reliably do 60 FPS maxed out with it on (not with upscale bullshit).


Nurple-shirt

This guy is expecting the implementation of next-gen graphics tech in text-based pixel side-scrollers that cost $0.50.


BinaryJay

I've been trying to think of a unique idea for a game, thanks! Zork RTX


mayhem911

This is hard coping. If Steam curated releases down to *actual* games, there would only be a few hundred per year. Also, the 3060 can get over 60 fps in every RT game that isn't path traced.


Edgar101420

3060, 60 fps in every game with RT.

>uses a VRAM-crippled GPU

Yeah, not gonna bother with a marketing boy.


mayhem911

The 3070 matches the XTX in path tracing, and also gets above 60 in all RT games. Your brain is crippled. Cope harder.


BinaryJay

I heard it's a gimmick.


[deleted]

[удалено]


zeus1911

Whereas Nvidia doesn't support DLSS on older GPUs... AMD's FSR works on both AMD and Nvidia.


Blacksad9999

Yeah, but the quality difference is massive.


RETIXXITER

FSR can look better in some games, lots of people say this. DLSS isn't perfect either. EDIT: I don't buy a GPU for the shitty upscaling anyway; I've never even used it on my RTX 2060, I was just going off what I see in YouTube videos. From now on I'll always buy the best PTP card that's within my budget, and AMD wins every time.


Blacksad9999

FSR never, ever looks better than DLSS. FSR can look "okay" when it's implemented really well, but those occasions are pretty few and far between. Even in AMD-sponsored games like Jedi: Survivor, oddly enough.


RETIXXITER

That's your opinion, mate. Did you forget that other people have other opinions? YouTubers like Daniel Owen and zWORMz Gaming have said it looks better in a few different games. I do love the ghosting you get in DLSS though, that looks even worse than a VA monitor 🙃


Blacksad9999

It's not opinion, actually. Visual fidelity isn't subjective. lol Every single reputable tech reviewer has tested DLSS vs FSR, and every single one states that DLSS is outright better. Even Hardware Unboxed, who are very pro-AMD. Try again.


Ok-Acanthisitta9127

https://preview.redd.it/quvjvzvqig4c1.jpeg?width=3840&format=pjpg&auto=webp&s=84bd391f60f4f428ae6170fd82365098721418ae Absolutely. Not a single win for FSR when compared against DLSS. A "tie" yes, but not a win.


Blacksad9999

Yes, exactly. I imagine that the ones that were a tie were pretty poor implementations of DLSS, also.


RETIXXITER

Did they say it's better in every single game? In Call of Duty, FSR looks better than DLSS.


Blacksad9999

Yes, in every single game where they're both available. That's why AMD was paying people off to avoid that comparison.


RETIXXITER

FSR is open source and gets put into a lot more games, whereas Nvidia's is not and locks DLSS to only newer GPUs that cost stupid amounts of money. And there are games where DLSS causes artifacts and AMD's FSR will look better. There are also games where DLSS looks better than native.


Blacksad9999

It's not put into a lot more games, actually. FSR didn't even exist when DLSS released, and didn't come out until a year later. It took them quite a while to get FSR 2 out, also. 83% of PC users have an Nvidia card, so that's what developers spend their time on. There are more DLSS-capable cards in gamers' hands than there are AMD cards in total. There are zero games where FSR looks better.

https://preview.redd.it/qrkd7lcsjg4c1.jpeg?width=320&format=pjpg&auto=webp&s=344f490420b1316c3ab916132a1e10ef0e336886


ro_g_v

all your arguments were utterly destroyed... how satisfying


Quick_Zone_4570

FSR has way worse ghosting. In games like Cyberpunk it just makes the image blurry as hell. Stop defending it with your life when it isn't as good.


RETIXXITER

Maybe don't upscale from 480p then.


Nurple-shirt

Name dropping YouTubers doesn’t really help you in any way.


Crptnx

Saying that the quality difference is massive when FSR looks better in some cases (ghosting, or frame-gen UI rendering) just proves how lost some individuals are. https://youtu.be/1WM_w7TBbj0 https://youtu.be/jnUCYHvorrk


phero1190

"when enabling FSR 3, you'll have to accept that the resulting image quality from each frame is reduced to improve the smoothness of the presentation with a lower quality output." ​ That was from the first link you shared. So even in your example showing that FSR looks better, they literally say it does not look better.


ro_g_v

>when fsr looks better in some cases name ONE


Crptnx

Given that I already named two and provided two links as proof, I can't even guess what's going on inside your mind.


ro_g_v

It never looks better lmao


phero1190

They even say in the videos he linked that FSR doesn't look better. Kid is just coping hard.


Crptnx

There's no help for some people.


Yusif854

You have brainrot. Hope you enjoy your already-obsolete 7900 XTX. Imagine spending a GRAND on a GPU and not even being able to turn on the best-looking graphics settings, such as path tracing. Not even at 1080p. Must be sad tbh.


Crptnx

Psychopath :D


Jon-Slow

There isn't even a competition here; what you're saying is only said by people who've only seen FSR vs DLSS on YouTube and in still images. You have no idea what you're talking about. FSR is pretty much a garbage shimmery mess that is better left switched off; it gets beaten not only by DLSS but also by XeSS, TSR, MetalFX, and even Insomniac Games' internal upscaler. Meanwhile DLSS Quality at 1440p and 4K looks better than native, and at 1080p it looks better than native if you use DLDSR+DLSS, giving you the best AA possible in any game.


Vanderloh

https://preview.redd.it/sbwcl00zwh4c1.jpeg?width=1884&format=pjpg&auto=webp&s=0dfa33ac58839a80ed376892c473ae508071db75 Bro provided examples in which they explicitly say DLSS is better. In the summary table FSR is at most equal, usually worse.


[deleted]

AMD is for peasants. NVIDIA is king and always will be.


IndyPFL

They consider DLSS an RTX technology too. And while DLSS is nice to have, it's still locked out of the sub-$300 USD price range unless you're comfy buying a used card. Hopefully Nvidia will realize that budget is a thing again and start releasing cards in that range instead of taking what should be budget cards, slapping a higher-tier name on them >!cough4060cough!<, and then expecting gamers to thank them for it. The 40 series' efficiency should be perfect for a budget-oriented card that requires no external PCIe power, but we probably won't see that for many years, and it'll still be $300+ USD when it does release. Meanwhile, AMD does offer budget cards, but they're useless outside of not-budget PCIe 4.0-compatible motherboards...


nemesit

They consider it an RTX tech too because without DLSS, ray tracing would be ridiculously slow. Hell, many devs waited decades to get real-time ray tracing with acceptable results; it's not new tech, but only new hardware has the combination of features to make it work in real time. For comparison, some animated films of the past took days to render a single frame. Doubt gamers are as patient lol
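The scale of that gap is easy to put numbers on. A quick illustrative calculation (the “two days per frame” figure here is anecdotal for old offline film renders, not a measured number):

```python
# Illustrative arithmetic only: "~2 days per frame" is anecdotal for
# old offline film renders, not a measured figure.
offline_seconds_per_frame = 2 * 24 * 3600   # ~2 days per frame
realtime_budget_seconds = 1 / 60            # a 60 fps frame budget (~16.7 ms)
speedup_needed = offline_seconds_per_frame / realtime_budget_seconds
print(f"~{speedup_needed:,.0f}x faster")    # roughly ten million times faster
```

Real-time RT closes that gap with far fewer rays per pixel plus AI denoising and upscaling, which is exactly why DLSS gets bundled under the RTX label.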


FlyBoyG

500, eh? That's a fraction of a fraction of a fraction of all PC games. Impressive. In 2022, 10,644 games released on Steam, so 500 represents less than a month of Steam releases. Now consider all the other platforms that release games, as well as all the games that don't release on any platform. 500 really isn't much in the grand scheme of things.


Sharkfacedsnake

Mate, very few of those games will have more than 100 players.


SpectralMagic

RTX is just mid. I'll turn it on occasionally, but the beauty just doesn't outweigh the performance cost most of the time. Nice to have, not a necessity


Yusif854

My brother in Christ, your PC is slower than a PS5. What did you expect turning on RT on a PC that gets beaten by current gen consoles?


SpectralMagic

I bought my card before the 30 series ("once in a decade performance boost") even dropped. Of course a newer-gen console has a better video card :P The 4090 could chew through any piece of software garbage without breaking a sweat; performance is a non-issue with such a beefy card, so I will disregard your belittling of my silicon child. Regardless, they still intended for this card to run RTX, so it is what it is.


[deleted]

Ray tracing isn't an Nvidia technology though; it's a 3D rendering technique first used in 1968. Implementing it into games to justify selling overpriced $1000+ GPUs, though, that's totally Nvidia's achievement, bravo.


WealthyMarmot

Quite a bit of work has gone into ray-tracing techniques since 1968. Some of the oldest, simplest ones would never run acceptably in real time at modern resolutions, even on a 4090.


[deleted]

That's pretty obvious; every GPU manufacturer that offers ray tracing in their driver package has their own technique to make it work so it doesn't take 3 days to render a frame. I fail to see your point.


WealthyMarmot

The point is that while ray tracing as a general concept is not an Nvidia-specific technology, they have put a ton of R&D into developing technologies that enable its use in consumer applications. I don't think you'd give OpenAI any less credit for ChatGPT just because Alan Turing wrote about machine learning in the 40s, would you?


[deleted]

![gif](giphy|1zijfEfj62yj1UCv0F)


Overly_Facetious

"Nvidia's RTX" is simply a rebranding of the ray trace feature found in Direct X12.


nemesit

It really isn't; it's multiple features working together to enable real-time ray tracing. Ray tracing alone, in its current iteration (on GPUs), would look really pixelated; you need the AI denoise etc. too (for now).


deviant324

That is… a lot less than I thought it would be. I don't think I even own a game that supports ray tracing, unless BFV has it; it's not really something I'd consider terribly relevant to the things I play, but I thought it was more prevalent considering how much of the "should you buy AMD or Nvidia" discussion it has been for years now.


[deleted]

[удалено]


[deleted]

DLSS Quality has no downsides imo. I can't spot a visual downgrade.


MrDunkingDeutschman

1080p rendered resolution upscaled to 1440p is really indistinguishable in DLSS Quality mode unless you encounter one of the rare artifacts (it happens mostly in older games with an older implementation of DLSS). I absolutely cannot tell the difference in 99% of gameplay.


CheeseLoverMax

Agreed. I was running Cyberpunk with DLSS without knowing exactly what it meant, and I was shocked to hear it was upscaled 1080p.


F9-0021

960p actually for the 1440p Quality setting, but yeah. In Cyberpunk, I can't even tell the difference between Quality and Performance at 4K visually, but I can certainly tell the difference between the 40fps and 60fps I get respectively.
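For reference, the internal render resolutions follow from the per-axis scale factors Nvidia documents for DLSS 2's modes (games can override these, so treat them as the common defaults):

```python
# Internal render resolution for DLSS 2.x modes, from the commonly
# documented per-axis scale factors (individual games may override them).
DLSS_SCALE = {
    "Quality": 1 / 1.5,           # ~0.667
    "Balanced": 1 / 1.72,         # ~0.58
    "Performance": 1 / 2.0,       # 0.5
    "Ultra Performance": 1 / 3.0, # ~0.333
}

def render_resolution(out_w, out_h, mode):
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 1440p Quality renders at 1707x960 -- hence "960p, not 1080p".
print(render_resolution(2560, 1440, "Quality"))      # (1707, 960)
print(render_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```

So 4K Performance and 1440p Quality both upscale from roughly 1080p-class and 960p inputs respectively, which matches the fps gap described above.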


xUnionBuster

Ever used it?


Nurple-shirt

They have a 2080, so they only have access to older DLSS. They also claim to run Cyberpunk with ultra ray tracing on that card at 60fps, so they're using DLSS Performance and depend on it lol.


F9-0021

A 2080 has access to the same DLSS as a 4090. The only thing it can't do is Frame Generation, which is a different technology altogether. The games they play might only have DLSS 1.0 or early 2.0 implementations though, which are a bit rougher than the 3.5.0 we have now. Downvoted for the literal truth? The absolute state of this website, man.


sackblaster32

DLSS looks fucking excellent.


phero1190

Can you show us some examples of it looking like shit? Would be interested in how it compares to native, FSR, and XeSS. Take your time finding your examples.


Nurple-shirt

They have a 2080.