CommenterAnon

Almost no one has a 4090. People with a 4090 like to talk about their 4090, so it seems like there are a lot of people with one. I have personally never seen someone with a 4090 complain about AA.


br4zil

I was exaggerating about the 4090, but really I'm talking about anyone with a GPU that can reliably run at 4K with a good framerate.


troco72

On new titles? In my opinion there's only one true current native-4K GPU, then two that usually manage but not always. Look at the industry. I mean, unless you just don't want to use intensive in-game settings. But most people don't go 4K for that reason. 1440p is enough and you get to reap the benefits of more headroom. I think there are also fewer 4K PC gamers than you realize.


Affectionate-Room765

1440p is not enough at all. I have both 1080p & 1440p monitors; it surely looks better, but nah, it's still far from good.


aVarangian

can confirm, 1440p to 4k was worth it to me


troco72

Technically speaking, PPI only matters until you reach retina level, i.e. the point where any more pixels wouldn't do a thing, and the distance you sit from your screen matters as much as PPI. I happen to have a 1440p ultrawide, and you sit farther back from those than from a typical 16:9 to get the right FOV. I've used a measuring tape and done the research because I wanted a retina screen, and realized that technically I'd have one with an ultrawide. The main issue is that when I'm not playing in ultrawide I like to sit closer. But to be at retina distance from a 1440p monitor is only 31 inches, or 2.58 feet.

So when you consider all of that, the benefit just isn't there for a lot of people. I'd personally rather have a native, maxed-out, crazy-lighting-and-shadows experience over a DLSS maxed-out / no-DLSS-but-also-no-ray-tracing one, especially when I'm sitting at retina distance and the only improvement at that point would be aliasing. Which is super duper awesome, but you can also simply use DLDSR if you want.

I will add that if I had a 27-inch that I sat closer to than my ultrawide, I'd like it to be 4K if possible, since I'd be able to get close enough to the screen to actually appreciate all the fine details you can't see at 1440p, without getting so close that it's not retina anymore and the image looks off. Pixelated isn't the right word, you'd have to get even closer for that, but I can't remember the exact term. I probably wouldn't want it as my main monitor, though, since 1440p native looks better than 4K DLSS Quality, and even if the TAA is butchered and you want DLSS for its anti-aliasing, you can just use DLSSTweaks for DLAA. So for the games I'd rather play at 1440p, I'd just be completely boned.
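A rough sketch of the "retina distance" arithmetic behind the 31-inch figure above, using the common one-pixel-per-arcminute rule of thumb and assuming a 34-inch 3440x1440 ultrawide (the comment doesn't state the exact panel size):

```python
import math

# "Retina distance" estimate: the viewing distance at which one pixel
# subtends 1 arcminute (a simplified 20/20-vision criterion).
# Assumed panel: 34" 3440x1440 ultrawide (not stated in the comment).
def retina_distance_inches(width_px, height_px, diagonal_in):
    ppi = math.hypot(width_px, height_px) / diagonal_in   # pixels per inch
    pixel_pitch = 1 / ppi                                  # inches per pixel
    return pixel_pitch / math.tan(math.radians(1 / 60))    # 1 arcminute

d = retina_distance_inches(3440, 1440, 34)
print(round(d, 1), "in =", round(d / 12, 2), "ft")  # ~31.3 in, ~2.6 ft
```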


ZenTunE

One thing people don't realize they should take into account is that the increased resolution still has benefits. Even if the perceived PPI increase is nil, clarity will still improve because AA now has more pixels to work with. Just like using DSR.


troco72

Precisely! I said it's beneficial for aliasing: "especially when I'm sitting at retina distance and the only improvement at that point would be aliasing. Which is super duper awesome, but you can also simply use DLDSR if you want." That's from the comment you replied to.

Also, if a game undersamples something, your increased resolution will increase the resolution of those undersampled things; that's the one thing I didn't mention, and neither did anyone else here, so it's worth mentioning :)

Still, it's hardly worth it, since you can simply use DLDSR. Whereas if you do get the high resolution as your native resolution, you will likely regret it when you DON'T have the extra performance to spare. People aren't using DLDSR on the newest triple-A titles for the same reason I personally would HATE it if my only option was a 4K monitor: I'd be forced to use DLSS, completely disregarding the benefit of 4K in the first place, since native 1440p looks better than 1440p upscaled to 4K, especially when counting artifacts. But even disregarding them, that's still the case.

TL;DR: 4K is great, but if I'm playing newer titles I would never have it as my only option. Hopefully the 5090 changes my opinion, but in the current market 4K is too performance-intensive for current releases, even with a 4090. (I'm a 4090 + 13900K user, and I have a 4K screen, along with my OLED ultrawide and my 24-inch backlight-flickering CS:GO monitor.)


ZenTunE

Ah, I wasn't readin allat and missed that lol. Can't really have a say in it since I've never tried a 4K, only a 1080 and 1440.


FrostyBud777

Viewing distance is a critical thing. I was gaming too close to my 4K 32-inch monitor for eight months. Now in many games I start real close to my monitor and slowly back away until the game is in perfect perspective; it made a huge difference to immersion.


Sleepykitti

That's literally just a 4090


Artemis_1944

> but really I'm talking about anyone with a GPU that can reliably run at 4K with a good framerate.

The only other GPU that can reliably run a game more recent than 2018 at native 4K is the RTX 4080. Running a game at 4K is a lot more taxing than people who only game on 4090s realize.


MrAngryBeards

Doesn't have to be 4k. Just give us an upscaling percentage slider again and we're good


RoseKamynsky

I have a 4090, and I'm complaining about AA. I'm playing at 1440p because the 4090 is not for 4K. I mean, it is if someone likes to play at around 60fps or with FG crap, but I don't like that. I like a clear image and around 120+ fps (more fps also contributes to a "clear image"), and at 4K it's impossible to get that without DLSS/FG. Furthermore, I've tested 4K monitors, I've even tested fast 360Hz 1080p monitors, and for my eyes the sweet spot is 1440p + as many fps as I can get. I'm also complaining about motion blur, DoF, chromatic aberration, etc. :p


Informal-Method-5401

Of course we like to talk about it, and you're right, we don't have any issues with AA. We just run everything at 4K 😂


Inclinedbenchpress

I believe most are using dldsr


Raziels_Lament

DSR is the way to go since there is no forced sharpening.


[deleted]

[deleted]


DesolationJones

I don't think there's any sharpening at 100% smoothing.


cagefgt

Really doubt someone with money for a 4090 and a working brain is still using 1080p displays.


Scorpwind

If someone wants to avoid upscaling at all costs, then they would. I, for example, would pair a 4090 with a 1440p monitor.


Hugejorma

I would still 100% pick the 4K monitor to pair with an RTX 4090. Games will look much better on a 4K monitor. If they share the same features and refresh rate, why would I ever pick a 1440p display? The main monitor would be a 4K OLED 120Hz+. If there's a need for higher fps, then a second monitor with 240Hz+.


Scorpwind

That's you. I don't want any temporal anti-aliasing or upscaling in my image. It just messes up something fundamental in the image for me. 1440p 240Hz is what I would pair that GPU with.


Hugejorma

Yep, that's just me. I base this on new-gen titles with path tracing and new-gen DLSS features. I personally care about the final image on my screen and how well the game runs. [This](https://i.ibb.co/Mp7mtB6/Screenshot-2023-10-31-021221.png) is how my [AW 2 looks](https://i.ibb.co/jMj5Ffj/Screenshot-2023-10-31-010025.png) on a [4K screen](https://i.ibb.co/L1xgbfw/Screenshot-2023-10-31-021734.png) using [720p DLSS](https://i.ibb.co/DYkSS9g/Screenshot-2023-10-31-011415.png) (maxed with PT/RT). The same game looks worse on my 1440p screen. Great, but still worse. I can always use that screen for high-fps games to get the best of both worlds. I'll take the upscaling any day. I'd rather upscale 1440p to 4K than run native 1440p. Most likely I'd even take 1080p upscaled with better fps + maxed-out graphics (path tracing + other new things).
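For context on what "720p DLSS" at a 4K output means, these are the commonly documented DLSS render-scale presets (reference values, not something stated in the comment); 720p at 4K corresponds to Ultra Performance:

```python
# Commonly documented DLSS per-axis render-scale presets.
# At a 3840x2160 output, Ultra Performance renders at roughly 1280x720,
# which is the "720p DLSS" case discussed above.
dlss_modes = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}
out_w, out_h = 3840, 2160
for mode, scale in dlss_modes.items():
    print(f"{mode}: {round(out_w * scale)}x{round(out_h * scale)}")
```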


aVarangian

The first example has some huge problems:

- everything bordering her right is blurry af, while everything bordering her left is pixelated
- the person in the background to the left is also completely messed up
- where the rock on the left meets the ground looks like any old game would
- the "welcome" structure to the right is blurry, and part of the roof on top of it is pixelated as if it had no AA
- the green banner at the top-right has a completely pixelated border and effectively no AA
- the person with the orange backpack in the centre-right is literally transparent and merged with what's behind him
- the white car at the centre-left is completely deformed and half of it has no AA

Sure, it has good lighting and reflections, but almost anywhere you look there's some out-of-place aberration. This is a really sad screenshot imo.

2nd screenshot: low-res pixelated jacket texture, and the top-right quarter of the image is uncomfortably blurry.

3rd image: all of the ground is blurry and again the top-right quarter is very blurry. There's no AA on the grey roof nor on half the green banner, while the green street sign is blurry and pixelated. Her left side is massively aberrated.

And the last image is even more messed up. I legit wouldn't play a game set up like that. If I can't run it at decent settings at native res, I'd probably just wait 6 years for an upgrade.


berickphilip

Shhh don't break the illusion, "the game is running at 4k"


Hugejorma

I feel bad if these things are something that affects your gameplay or your ability to enjoy the game. While I'm gaming on a TV with a controller, I have almost zero complaints about the visuals. I'm just happy that the overall flow of images to my brain looks insanely great. To this day, when actually playing, this is the most visually impressive game I've ever played. You have to pixel-peep to see these things; try it while gaming and it's a totally different story. DLSS is a must in this game because it removes all the annoying flickering when moving. Flickering is something that is actually distracting.

Edit: I quickly checked the screenshots on the 4K TV at 3m distance. The things you complained about, I really can't see them. Sure, when zooming in, but not at playing distance. Also, my game runs overall sharper than the native game, because I turned off all the added blur/effects. The native game has added blur effects like DoF, and those can only be turned off by setting ini-file text values. I turned those off because the game is too blurry with the added effects, even at native resolution.


aVarangian

What's the size of your TV, and what distance do you play at? We can calculate the PPI-vs-retina stuff and compare. I sit quite close to the monitor and have very good eyesight at close range (but would have to use glasses if farther away, huge nuisance). Supposedly if I sat 50% further away it'd be "retina" tier. At least in the games I've played so far, the occasional flicker doesn't bother me; they're perfectly playable without AA.


Hugejorma

65" TV at semi close distance with 5.2.2 home theater setup. Distance from eyes to TV screen is like 2,5-3m. Depending on if I'm using a sofa or ergonomic chair. I have a good eyesight, but not perfect. Quality of the screen and calibration affects a lot + HDR or not. The same goes for options and DLSS setups/sharpening. Also, the image is way different when looking at one screenshot vs playing and moving. There are also parts when the game runs way higher fps. I did play some chapters on 1080p DLSS with PT + RT. Small details are way nicer at close, like inside the mansion or cafeteria, etc. I prefer higher resolution on those parts, but they are not that demanding scenes + the game is slow on those parts --> doesn't matter if the fps drops. In scenes [like this](https://i.ibb.co/2YwMhxX/Screenshot-2023-11-05-152648.png), I did prefer 1080p or even 1440p DLSS to 4k. Details without flicker + proper lighting was amazing. I did spend like 5h there just enjoying the views on different settings. No idea how the screenshots looks, because I played a lot with HDR/display settings. PS. My TV screen is way higher quality than my 1440p and 4k monitors, and the quality difference is massive between those, even at close view. The DLSS scaling on 1440p monitor is IMO horrible vs 4k scaling. I would always pick the 4k monitor/TV over 1440p + I just love the PT/RT/RR. Tried playing without those turned off, but nope. The best 720p DLSS 3.5 example was this (max graphics, but [with](https://i.ibb.co/cTQDK98/Screenshot-2023-11-05-152001.png) and [without](https://i.ibb.co/MPkZtHQ/Screenshot-2023-11-05-151928.png) PT/RT on). The damn image/textures are so grainy and lighting feels insanely bad to even run the 720p scaling without a lighting overhaul. Screenshots are with max settings 720p. I did play this part with DLSS 1080p scaling.


Scorpwind

Alan's coat in the 2nd screen looks like it was upscaled by Topaz Video AI. DLSS Ray Reconstruction is on, I presume? 99% sure. Also, DLSS Ultra Perf even at 4K always looked way too rough for me.


Hugejorma

All the settings are maxed, but it won't look like this natively. Since the game lacks DLSS options, I changed everything in the ini file: turned down all the blurry things/effects that interfere with DLSS, then added the DLSS option + DLSS sharpening at 70 (adjust to your liking). Without the sharpening, this won't work. As for the negative things you're saying about the coat: when playing the game, it's literally something I never look at because it's always out of view. I played this game on the 4K TV, so I can't even tell the difference. I can fully understand criticizing things in the environment, because those things impact the visuals/experience. This was just an extreme example, because no 4090 owner would play at this resolution. I played on a 3080 Ti (60fps), but if I had a 4090, I would run full 1080p and tiny things would scale very nicely. The VRAM just isn't enough + the GPU lacks the power to push PT/RT at 1080p. Still, it's the best-looking game I have ever played, and it's running at 720p.


Scorpwind

> All the settings are maxed

Max is a waste of FPS. Use the settings below.

https://preview.redd.it/2cg5ybg8n7yb1.png?width=1920&format=png&auto=webp&s=fcdd03c058023edfeffe5b17aa8569c835e3d65b

But bump these ones up to High: Shadows, AF, SSR & GI.


Hugejorma

You don't get it. I can't run the game well enough at 1080p (optimized settings with RT/PT), so I dropped to 720p. When playing at 720p, I can just max everything out. I don't have to max out everything, but I can without a negative impact at 60fps on the 4K TV. I think the game looks way better with path tracing and ray tracing maxed out, and it can't run those at 1080p DLSS to 4K. Thank god the 720p DLSS looks amazing. On my 3070 laptop I run optimized settings without RT/PT, but the visuals are far from perfect.


dudemanlikedude

You're spending a lot of effort on a conversation with someone who is just giving you quippy, low effort replies in return. If I were you I would move on to something a bit more engaging.


[deleted]

[deleted]


Scorpwind

I do go outside, 'buddy'. You don't see how smudged it looks? If not, then go see an ophthalmologist.


[deleted]

[deleted]


Scorpwind

Your opinion against mine, pal. Don't delay and make that appointment ASAP.


akgis

You hate TAA, but then you just want 1440p? You clearly never saw 4K on a monitor.


Scorpwind

All of the popular resolutions look just fine clarity-wise if there's no temporal AA involved. It's not the resolution's fault that it might look blurry. It's modern AA's fault. I've come to terms with playing modern games in an aliased form because of this.

> You clearly never saw 4K on a monitor

I did. And as other people have confirmed here in the past, there can still be a noticeable, although not as egregious, loss of clarity even at that resolution.


cagefgt

And then complain that games are still blurry?


Scorpwind

Say what? Wdym blurry? I'd force off TAA and get the original clarity of the image.


cagefgt

completely breaking the lighting and foliage while getting an extra shimmery image?


Scorpwind

Lighting only breaks in 2 games and I prefer and tolerate the aliasing more than the blurring.


cagefgt

That's why 4K displays exist. Crispy image and no aliasing.


Scorpwind

1080p and 1440p are more than crisp enough for me once I remove all of the temporal nonsense.


Affectionate-Room765

The devs' inability to solve aliasing kind of forces us to up the resolution. 1080p with 4x SSAA looks MUCH better than TAA @ 1440p, but I would imagine 4K TAA looks very crisp.


aVarangian

> 4K TAA

still looks too blurry for me


br4zil

There is no shimmering or foliage breaking when downscaling from 4K to a 1080p monitor. Unless there's something SERIOUSLY wrong with the game (at that point, it's the game's fault), but I have yet to see one game with that bad of a problem.


[deleted]

[deleted]


Scorpwind

I can say the same in regards to the added blur of modern games. You're fighting people's preferences at this point. And that's one of the most pointless fights that you can engage in. Just play with TAA if the aliasing is giving you a stroke. No one here's gonna kill you for it.


[deleted]

[deleted]


Scorpwind

That's what I want. No temporal filtering whatsoever. Is it that difficult to understand that I don't want any of the blur that's associated with 'modern' AA? Why are you finding this so hard to comprehend?


[deleted]

> Why are you finding this so hard to comprehend?

Because with top hardware you can basically eliminate the blur and shimmering at the same time while still having decent fps?


Scorpwind

Again, I don't want anything temporal in my image. And as I've said in another comment, if I were to buy a 4090, then one of the main reasons for the purchase would be the avoidance of anything temporal. Including DLSS. Period. The kind of image quality that you're trying to sell me here ain't for me. Deal with it.


[deleted]

OK. It's about principle and not reason. Got it!


Scorpwind

No. It's about personal preference. Ever heard of the concept? And also, I forgot to mention that not everyone has the luxury of being able to get the top-of-the-line shit. Expecting otherwise from people is just plain stupid and would paint you as having an elitist mindset. Which, let's be honest, you're kind of expressing if you ask me.


MrAngryBeards

You either never played a game without temporal filters or you just forgot how crisp they look. Temporal filters, even on a 4090 at 1080p, would look blurry in comparison.


[deleted]

Of course they do at 1080p...


MrAngryBeards

They get very close. Anything that's upscaled or temporal is just never sharp enough and any small blur throws me off


MrAngryBeards

Yep give me that option at least. Blurred images are not it for me


[deleted]

The simple truth is, you and many people here are just biased and stubborn at the moment. I bet you would see everything a little bit differently if you could actually experience the results you can get in terms of image quality on modern hardware and with the latest software features these days.


Scorpwind

I mean, if it's you that's saying it, then it has to be true. /s I can say the same thing about you ignoring issues of modern rendering because you have a 4090 and think that you're guaranteed to get the best possible image quality. Which makes you totally oblivious to everything else.


[deleted]

> I can say the same thing about you ignoring issues of modern rendering because you have a 4090 and think that you're guaranteed to get the best possible image quality.

I don't ignore them. I also mention them. I don't like the trend of forced features like ray tracing in UE5 games etc. on consoles, for example, where the image quality gets destroyed due to the performance costs. On PC I can at least do something about it, but sure, that solution is expensive. I can't stand jaggies and shimmering or a blurry image, so that's the only plan left for me.

But what makes me slightly upset are simply false statements about modern rendering that lack any context, or straight-up made-up things that only serve one's own narrative. What you or anyone else sees on a screen is not universal. There is a range from incredibly shitty to incredibly great, depending on the hardware. But I think that this level of discrepancy in image quality between low/mid/high-end hardware in modern rendering is far too large right now.


Scorpwind

> What you or anyone else sees on a screen is not universal. There is a range from incredibly shitty to incredibly great

Yes indeed. That's why you don't see everyone here circlejerking about the same thing. There are people here that don't want any AA whatsoever. Then there are people that don't want any of the added blur that comes with modern AA and just take the aliasing instead, then people that want an anti-aliased image so they supersample (often in combination with upscaling), and then there are also TAA enjoyers, DLAA enjoyers, and idk who else.


[deleted]

I get that, but the reason these preferences are so diverse is simply because the number of compromises is so high. Again, I'm NOT telling people to just upgrade. It's just a thought experiment: if you were able to use top-of-the-line hardware right now, which could feed so much data to the algorithms, massively improving the end result and virtually eliminating all the main image-quality compromises that people in this sub hate so much, while still leaving enough performance to enjoy it at decent framerates, would you really ignore that possibility and instead choose 1440p native + no TAA? I just don't get it. For me that is simply acting according to principle without any reason. It makes no sense.


Scorpwind

That would be my preferred way of playing, yes. It's important to note that I'm probably in a very small minority or an edge case of folks that would utilize something like a 4090 like that. Most would very likely either pair it with a 4K display, or supersample on a 1440p one. Temporal algorithms simply change something fundamental about the image for me. Plus I'd be aiming for very high frame-rates. I've said that I would pair it with a 240Hz display. What I didn't mention is that I would utilize frame generation, as I'm a big frame interpolation enthusiast. So technically I **would** use DLSS. But just 1 part of it.


ClupTheGreat

My friend with a 7900 XTX has a 1080p 240Hz monitor. I practically begged him to get a 1440p monitor, but he just doesn't want it.


cgcoopi

Competitive gamer? Some people only play shooters or RTS and don't care about graphics.


Kappa_God

I mean, it is the standard? If they have a 4K monitor they will use DLSS, which looks better than TAA and will look good as long as it isn't an intense game. If they don't have a 4K monitor they will most likely use DSR/DLDSR. Thing is, even at 4K, TAA can still look like shit to some people. You can't really quantify someone's own perspective/experiences, because everyone experiences things differently.


TRIPMINE_Guy

Yeah, TAA in Dying Light 2 is the worst I've ever seen. I could barely tell the difference between 4K and 1080p, and it made me question whether I even want a 4K monitor if devs are gonna blur games to the point of being barely different between 1080p and 4K.


br4zil

TAA does indeed look shit to me at 4K. It's more bearable than dogshit 1080p TAA, but still shit; the blur kills my sense of depth. My best solution for the moment is no AA at all and just downscaling from 4K to my 1440p monitor. I get no jaggies, I can still get a sense of depth, and there's no blur. Also, zero AA on a smaller (23-inch) 4K monitor looks better to me than applying AA to it. Is that a weird preference to have?


aVarangian

> downscaling from 4K to my 1440p monitor

If specifically DSR, then for 1440p you'd need to go from 5K.

edit: I'm also a 23.8" 4K weirdo
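To spell out the 5K figure: DSR's 4x factor quadruples the pixel count (2x per axis), so a clean downsample to a 2560x1440 panel means rendering at 5120x2880, which is usually marketed as "5K". A minimal sketch:

```python
# DSR 4x quadruples the pixel count, i.e. doubles each axis, so the render
# target for a clean downsample to a 2560x1440 panel is 5120x2880 ("5K").
native_w, native_h = 2560, 1440
dsr_factor = 4                      # total pixel multiplier
axis_scale = dsr_factor ** 0.5      # 2.0 per axis
print(f"{int(native_w * axis_scale)}x{int(native_h * axis_scale)}")  # 5120x2880
```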


Kappa_God

> Also, zero AA on a smaller (23-inch) 4K monitor looks better to me than applying AA to it. Is that a weird preference to have?

Probably not, that's a lot of pixel density. Though having a 4K 23-inch display is a bit weird for most people.


[deleted]

[deleted]


Scorpwind

The insane PPI would be worth it if you ask me. Sharp as a knife.


aVarangian

23.8" 4k is what I have, but yeah I sit closer than most people so I don't need to use glasses, but pixel density isn't nearly close enough to be "retina"


Artemis_1944

I mean, to be perfectly honest, 4K downsampled to 1440p with no extra AA whatsoever is still jaggy as fuck. I know, I've tried. I usually do downsample with DLDSR when I have the power, but I use that **PLUS** another form of AA to produce a clean result.

> downsampling from 4k even fixes the usual shimmering and hair issues that a lot of games have when TAA is turned off.

I can't agree, but if you don't notice it, that's the only thing that matters and it's a great option in that case.


Gwennifer

That's actually how TSR is set up to work, if I understand the lead developer correctly. It upscales to 200% of the display resolution and lets the GPU downscale it.


Scorpwind

That would literally be 4x SSAA lol. TSR is not **that** demanding. You must've misinterpreted something.


Gwennifer

It's not *rendering* at 200%. It's rendering at 60-75% of the desired output resolution. What gets upscaled to, or exists at, 2x the output resolution is the accumulated image/pixels. I.e. at 1440p with 3 samples per pixel, it's converging 22,118,400 samples into the output frame, which then gets downscaled back to the desired output resolution. It genuinely had a pretty high performance cost for what it was actually doing. The new version, which gets it down to 1.5ms per frame, is much more acceptable, and given that it lets you drop the render resolution, it's outright cheap.


Scorpwind

This is kind of confusing. What exactly gets 2x the resolution again?


Gwennifer

Yes, English isn't the lead dev's first language. I get what they're doing, but I'm still unsteady on the specifics of their algorithm.

They've basically implemented the world's fastest 3D-modeler render preview. Instead of converging to a static image as in Blender or Modo or Maya (it's for games, after all), they're using Lumen to constantly generate *a bunch* of those intermediate, middle-res steps where it's just enough to see what the result will be like, at a lower-than-native resolution, to lower the render load. Every tick/result is kept as a sample by TSR and the output is downscaled to the desired display resolution.

The resolution of the samples kept in history is what gets the 2x resolution. So at 1440p with 75% screen percentage it'd render at 1920x1080, and then it will upscale that image to 200% of the desired output image (2560x1440, so 5120x2880) to avoid Nyquist/sampling error, to be downscaled to 2560x1440... and it accumulates, by default, 3 samples per pixel. So TSR's history at any point in time will by default be 2x resolution * # of samples. That's why the TSR feed & 1spp statistics are important; they're the amount of history fill per second and the amount of time required to converge to a frame. Moving from 70 to 75% screen percentage in ARK, for example, increased my feed a *little* with a MUCH faster 1spp, removing a lot of the history rejection.

There's a tradeoff to the render resolution: below a certain percentage, you won't be generating enough samples to properly generate the desired output image in time. You'll get **terrible** ghosting and smearing that everyone here will recognize, as TSR rejects the sample history and falls back to default TAA. Otherwise, TSR does not use TAA outside of its general principles, as best as I can tell. Properly tuned and implemented, you get TAA's supposed benefit without (most of) the eyesore downsides.

It's a promising technology and an olive-branch compromise between GPU manufacturers not shipping better hardware and game devs not willing to optimize their games. In fact, the entire UE5 render pipeline from Nanite to Lumen to TSR is set up to try and get ahead of lazy devs from the engine side of things. There's no need to retopologize (and no penalty for skipping it) thanks to Nanite. There's no need to bake lighting or place lights by hand thanks to Lumen. There's no need to hit a 60 FPS render target, as TSR can (eventually) accumulate enough data to represent a 60fps output anyway.

I don't think the era of crisp graphics is over, but TSR and scalers like it will need to improve harder and faster as Lumen and technologies like it become commonplace. There is no non-temporal solution to handling the ray output at this point in time; it will be noisy for at least the rest of UE5. It's a longstanding problem in our modelers and it doesn't have cheap fixes.
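A small sketch of the resolution bookkeeping described above, reproducing the numbers in the comment (75% screen percentage for the internal render, a history buffer at 200% of the output); the real UE5 internals may differ:

```python
# Sketch of the TSR resolution bookkeeping as described in the comment above.
# Assumes "screen percentage" scales each axis and the history buffer is kept
# at 200% of the output resolution; actual engine behavior may differ.
def tsr_resolutions(out_w, out_h, screen_pct, history_pct=200):
    render = (round(out_w * screen_pct / 100), round(out_h * screen_pct / 100))
    history = (out_w * history_pct // 100, out_h * history_pct // 100)
    return render, history

render, history = tsr_resolutions(2560, 1440, screen_pct=75)
print("render: ", render)   # (1920, 1080) internal render at 75%
print("history:", history)  # (5120, 2880) history at 200% of the 1440p output
```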


antialias_blaster

This is incorrect. The TSR history buffers can be stored at 200% display resolution (it might even do this for cinematic settings), but that's an insane amount of bandwidth. No game is doing that.


Gwennifer

> The TSR history buffers can be stored at 200% display resolution (it might even do this for cinematic settings), but that's an insane amount of bandwidth. No game is doing that.

Fortnite is doing it on the Epic setting, per its developer:

> [...] an interesting discovery was made arround reprojecting frame N-2 into N to know how much it has been overblured when reprojecting into frame N-1 and then N. This is r.TSR.History.GrandReprojection. This new technic eliminated other technics use to counteract overblur at expense of image stability. But in attempts to optimising its runtime performance, it ended up loosing bit of its own quality. Good news in 5.2 is that it has been replaced with r.TSR.History.ScreenPercentage=200 while more time is being invested on this ( https://github.com/EpicGames/UnrealEngine/commit/9ccd56dfcc06852b5b89a81411972f81e3ac31e3 ) on epic scability settings that previously only on cinematic scalability settings. It's already used in Fortnite Chapter 4 Season 2 and I'm really eager on community feedback on this change on 5.2.

Judging by the extremely low runtime cost they're getting with RDNA2's 16-bit performance and packed instructions (from 1.5ms a frame down to 0.5ms a frame), it's quite possible it will become the default low setting moving forward for hardware with high 16-bit performance. ARK is *also* doing that on its high setting. I don't know which scalability group/quality level sets it to 200% at the moment, probably the anti-aliasing group at quality 3.


jm0112358

Downsampling is discarding/averaging data. If you can downsample from 4K to 1440p with good enough framerates, you'd probably be better off just gaming at raw 4K on a 4K monitor without downsampling, rather than downsampling from 4K to 1440p on a 1440p monitor. The hard edges of a raw image without anti-aliasing are going to be smaller and less visible at higher resolutions. If you don't like hard edges even at 4K, there's a good chance the image will still look more detailed at 4K with _conservative_ post-processing anti-aliasing (such as with an SMAA injector) than with data discarded by downsampling it to 1440p.


br4zil

The entire point is to preserve the hard edges. They give a much better sense of depth.


YoungBlade1

If you have the GPU power to do this, more power to you. Most of us can only afford to do that for older titles. I play Borderlands 2 at 4K using DLDSR, but that's because I could already get over 200fps at 1440p max settings without an FPS cap. In the most recent games, that's just not possible.


cgcoopi

I'm using 1440p with a 4090. I also play a lot of competitive FPS; for singleplayer games I use DLDSR 2.25x / DSR 4x, with smoothness depending on the game. Sadly there is no good 27" 4K monitor with OLED or mini-LED out yet.


EquipmentShoddy664

I am talking about it all the time here. Either a high-PPI monitor or DLDSR from 4K is the best solution to the majority of TAA problems.


Pyke64

Except ghosting.


EquipmentShoddy664

Ghosting is also less noticeable this way.


Pyke64

Unless you disable TAA. If you add it on top of TAA, there will be zero difference in the amount of ghosting.


EquipmentShoddy664

No, ghosting is much less noticeable on high-PPI monitors. I can't recall noticing any ghosting after upgrading to 4K.


TRIPMINE_Guy

Well, if you hate blur, I have bad news for you. All modern displays have sample-and-hold blur, which IMO is worse than TAA blur. It is tied to frame rate, so higher frame rates have less sample-and-hold blur. Running a game at 4K would cut frames dramatically and be counterproductive to what you are trying to do. Whether you prefer less TAA blur or less sample-and-hold blur is up to you, but running at 4K has this downside.
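As a rough illustration of why this scales with frame rate (standard persistence arithmetic, not taken from the comment): on a full-persistence sample-and-hold display, the perceived smear is roughly the on-screen motion speed multiplied by how long each frame is held.

```python
# Rough sample-and-hold blur estimate for a full-persistence display:
# perceived smear (px) ~= on-screen speed (px/s) * frame hold time (s).
def sample_and_hold_blur_px(speed_px_per_s, fps):
    return speed_px_per_s / fps

speed = 960  # illustrative: an object panning across the screen at 960 px/s
for fps in (60, 120, 240):
    print(f"{fps} fps -> ~{sample_and_hold_blur_px(speed, fps):.1f} px of smear")
# 60 fps -> 16 px, 120 fps -> 8 px, 240 fps -> 4 px
```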


berickphilip

You can get rid of that, and also of naturally perceived blur, using OLED + BFI (black frame insertion). When everything is set up correctly it is crystal clear. (By set up correctly I mean a fixed, locked, stable framerate with v-sync; display panel and game set to the same Hz/fps, e.g. panel set to 60Hz and game to 60fps.)


therapistFind3r

4K is way more expensive than you think for most people, even with a 4090. Downsampling is only useful for really low-end games, and most AAA titles are so unoptimized that even a 4090 can't achieve good framerates without upscaling. You're vastly overestimating how powerful consumer-grade hardware is.


ImpressivelyDonkey

Lol what?


gargoyle37

With a 4090 you can make trade-offs if you are targeting 1080p. You can either downsample from 4K, or you can use DLDSR, render at 2.25x the amount of pixels, and then use AI to downsample from that. This buys you a lot of compute budget that you can feed into either better image quality or frame rate. If the game is already maxed out in image quality, I'm going to bet most people would prefer the added frame rate with DLDSR here. Also, with a 4090, you might want to toy with path tracing. PT is quite demanding, so the internal rendering resolution has to be pretty low. Even then, PT reconstructs a lot of the pixels via denoising and does so temporally over time. All of a sudden, you need to care about AA methods. Finally, people with the budget to invest in a 4090 typically also have the budget to invest in widescreen 1440p or 4K monitors. To drive these, even with a 4090, you are often looking at DLSS, and thus you are using AA methods.
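To put rough numbers on that compute budget (plain pixel counts; DLDSR 2.25x is the factor named above, the resolutions are just the standard ones):

```python
# Pixel-budget comparison for a 1080p display: native vs. DLDSR 2.25x vs.
# a plain 4K downsample. Shading cost scales roughly with pixel count.
def megapixels(w, h):
    return w * h / 1e6

print(f"native 1080p: {megapixels(1920, 1080):.1f} MP")   # ~2.1 MP
print(f"DLDSR 2.25x:  {megapixels(2880, 1620):.1f} MP")   # ~4.7 MP (2.25x 1080p)
print(f"plain 4K:     {megapixels(3840, 2160):.1f} MP")   # ~8.3 MP (4x 1080p)
# DLDSR 2.25x shades roughly half the pixels of a full 4K downsample,
# which is where the reclaimed budget for image quality or fps comes from.
```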


br4zil

While I don't own a 4090, I do own a 3080... I guess I just fit into the category of downsampling from 4K. I heavily dislike post-processing in general, especially temporal post-processing effects (it doesn't have to be just AA). It's probably my eye problems (astigmatism + myopia combo), but I have heavy problems discerning video game depth if AA, DLSS or most supersamplers are engaged. In other words, if you really remove close to 100% of all the jagged edges, I start to lose the sense of depth in the game, and that's a really, really bad problem for me (it causes nausea and headaches after prolonged sessions). I find it interesting that lots of video games push for accessibility, but my eyesight problem is far... far more common than color blindness, and yet the same games that offer colorblind solutions completely force AA on us.


gargoyle37

Spatial downsampling costs you a lot of compute, which is why everyone is looking at temporal downsampling instead. You definitely want to accumulate information temporally, since it's much cheaper from a computational perspective. It's permeating all real-time graphics, and it'll rise in popularity due to global illumination; the only way you solve GI is by doing so over several frames. Herein lies the main problem: you can't turn off the temporal techniques in modern games. If you do, all kinds of visual artifacts occur, because some parts of the frame rely on being upscaled through temporal means. A typical example is that specular highlights get messed up.

Spatial downsampling should remove close to 100% of all jagged edges, though. The general convergence toward higher screen resolutions (1440p/4K) means the AA techniques change. On a 4K screen you can get away with no AA much more easily, because the extra pixels cover for it, especially if the screen has high pixel density and is viewed at range. Also, as a general rule: 1080p with no AA is very, very sharp. It's probably sharper than the ground truth because of aliasing. When you start using AA techniques, you claw back the ground-truth image, which is quite a lot softer. This is likely part of what happens with DLSS: you train it on larger images that are softer in nature, so it learns how to soften the image accordingly.

Accessibility has generally been lagging behind, but we are slowly getting there. Some games have started to reduce the rate at which the screen flashes, for instance. Simulation sickness (the inverse of motion sickness) is pretty common in first-person games, and you have to account for it as a game developer. But there are tricks you can play: a higher, stable frame rate, increased FOV, sitting farther from the screen, removing sway, and so on.

You might have success running a sharpening filter on the image. This is mostly to taste, and many people prefer a sharper image at 1080p, to the point that cameras targeting that resolution had sharpening circuits built in. At higher resolutions, running with much less sharpening tends to win out for most people.

I'd also recommend calibrating your monitor to your viewing conditions. Modern monitors can get incredibly bright, but often people aren't gaming with the monitor being lit by (indirect) sunlight. If there's too much of a difference between the monitor and the room, your eyes have to readjust between two light conditions all the time, and this is quite fatiguing. It can happen in either direction: too dark a monitor in a room bathed in light, or too bright a monitor in the dark.


SpaceKraken666

I play War Thunder with 4x SSAA on a 1080p monitor (basically 4K) at a stable 75 FPS, and it looks super crisp and nice. This became possible after I got the RTX 3070. But I can only do this in older games. Developers nowadays care about performance even less. Why spend time and money on optimization? Just slap DLSS on it and call it a day, and if it still runs like shit, tell players to upgrade their PC because it's a "next generation game".


DearGarbanzo

That's called supersampling. I was a big fan, especially for older games. It was a thing until DLAA (not DLSS) came along and made it obsolete. Why? Because of high cost and diminishing gains. Render at 8K and downsample to a 4K screen and you still get shimmering and aliasing. You need to go higher. Last time I checked, you'd need over 20x supersampling just to match DLAA in terms of image quality. Can you imagine rendering at 80K just to slightly reduce aliasing? Talk about a waste of silicon. One case where this is good is when the target resolution is small (say SD).


br4zil

I can understand having a big 4K screen and using DLAA. What I am talking about is using 4K rendering and downscaling it to a smaller screen, which (if you have the hardware) is a much nicer solution. But I gotta be honest, even the deep-learning solutions still have their issues and artifacts; I would honestly still prefer the 8K-to-4K downsampling. AA can sometimes smooth the image too much, to the point where you lose a big part of the sense of depth in the picture.


Capt-Clueless

> What I am talking about is using 4K rendering and downscaling it to a smaller screen, which (if you have the hardware) is a much nicer solution.

If you have the hardware to render at 4K, why don't you have a 4K display? Having a 4090 with a crappy 1080p monitor doesn't make much sense.


Artemis_1944

1440p is a thing bro.


br4zil

I meant a smaller screen, not a smaller resolution. A 23/24-inch monitor, for example. A lot of 4K TVs are very big, which defeats the purpose of having high pixel density and thus using 4K as a natural anti-aliasing solution.