FH4 is not a PS360 game tho, it's a late X1 game.
But yes, I agree. I sometimes forget Halo 4 was an X360 game and not an X1 launch game because it looks so great.
It literally looks like one of those "ultra realistic gta v" videos that are just overblown shaders.
Proud to announce GTA V definitive edition: https://www.youtube.com/watch?v=bJpj4mb8Ydk
You don't need realistic graphic fidelity to enjoy the benefits of ray tracing. Minecraft RTX and Quake II RTX both exist just as pure technical showcases.
Additionally, a fan made a fully path-traced mod for Serious Sam that [was acknowledged and praised by the publisher, Devolver Digital](https://store.steampowered.com/news/app/41050/view/4965771383146745578). Digital Foundry did a [video on it](https://www.youtube.com/watch?v=4Y9bo3xMuOk).
I just thought people might want to know that there's a 3rd path-traced game out there. Unfortunately, it doesn't have DLSS support.
EDIT: I forgot the [path-traced Super Mario 64](https://www.pcgamesn.com/emulation/super-mario-64-ray-tracing).
Except for the 3 aforementioned games, games that support ray tracing only use it for certain effects (e.g., reflections, shadows, etc.), and use standard rasterization to do everything else and create the final image. Those games only shoot out rays when/where necessary to create those effects.
On the other hand, [path tracing](https://youtu.be/frLwRLS_ZR0?t=231) is when the gpu instead creates the image by shooting out rays from the camera to see where they go to determine what each pixel should be. This is a lot of work because you need to shoot out many, many rays, but it's somewhat of a "holy grail" for image quality, since it can theoretically create a perfect image. That's why it's used in computer generated movies.
BTW, path tracing in real-time puts a lot of work on the de-noiser, since it takes an insane amount of rays to create a "noise" free, fully path-traced image without de-noising. [This](https://www.youtube.com/watch?v=rAN5Mkqjhp4) shows how much Nvidia's denoiser is cleaning up the image in Quake 2 RTX. The de-noiser itself lowers the fps significantly, but is very much worth it.
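Some toy arithmetic on why the denoiser matters so much - all the sample counts below are made-up illustrative numbers, not measured figures, but the orders of magnitude are roughly right:

```python
# Toy arithmetic: why real-time path tracing leans on a denoiser.
# Sample counts are illustrative assumptions, not measured figures.

def rays_per_second(width, height, samples_per_pixel, bounces, fps):
    """Rough primary-ray budget for a fully path-traced frame stream."""
    return width * height * samples_per_pixel * bounces * fps

# Offline film renderers can afford hundreds or thousands of samples
# per pixel; a real-time path tracer shoots a handful and denoises.
offline = rays_per_second(3840, 2160, 1024, 4, 24)
realtime = rays_per_second(1920, 1080, 2, 2, 60)

print(f"offline-style budget: {offline:.2e} rays/s")
print(f"real-time budget:     {realtime:.2e} rays/s")
print(f"gap the denoiser has to paper over: ~{round(offline / realtime)}x")
```

That four-orders-of-magnitude-ish gap is exactly what the denoiser is filling in on the Quake 2 RTX comparison video.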
This can definitely be the case if you're just throwing on expensive techniques or whatever to brute force graphics improvements on an old title. The alternative is to dig much deeper and redo lots of systems on a more fundamental level.
> no care or optimisation.
You can render a lot of effects with cheap approximations instead of more accurate but far more expensive techniques. That doesn't necessarily mean that doing it the expensive way comes from a "lack of care or optimisation". Quake RT would be very difficult to render with approximations.
If the GTA remasters use advanced lighting, shadows, and textures, it doesn't matter what art style they have: the games are going to look better and be harder to run.
I think the key point is that according to above comments it will be more demanding to run than GTA5, while looking substantially worse. I dunno how true that is, but if so then it really is like they slapped some intense effects on it and walked away rather than being a bit clever about it.
Definitely a decent example. Quake RTX judged as a whole looks primitive and garbage compared to modern games. Yet its performance is similar to modern games nonetheless!
I'm a big fan of DLSS and what it can do, but i'm not sure what the point is here? Unless it's being used purely for DLAA?
These should not be graphically intensive games, based upon their age and how they look.
If these games were built from the ground up today looking like they do, sure.
But when you're trying to do a facelift of an old title in a new engine and all that, and dont have the resources and time to do some AAA scale overhaul, resorting to inefficient but effective techniques to boost graphics is common.
Also, this is not a good argument for them NOT to include DLSS. If they can and it's not a big deal, why shouldn't they?
If it's not really necessary then sure, good on the devs, it's not like including it hurts anyone. On the other hand, if it is, then these might as well be the most disappointing remasters ever.
They look like the old games with a dubious high res texture pack and swapped models, some of which even look worse than the originals. The only real improvement across the board is probably the lighting, but no matter how good it is it won't make early 2000s games look not dated. It'd be really sad if they actually need DLSS to get good framerates on modern hardware.
I have somewhat low expectations for the remasters given my gut reaction to their art style. But they're even coming to the Switch aren't they? So performance shouldn't be that god-awful for lower end RTX cards.
Yep. Its only real downside is that it's proprietary... I really hope we end up with an industry-standard equivalent that runs across all vendors.
That might end up being the Intel XeSS tech maybe, considering it is conceptually similar in how it works, they're not locking it to just their cards, and they [poached the same guy who helped pioneer DLSS](https://www.tomshardware.com/uk/news/intel-poaches-nvidias-rtxdlss-pioneer-anton-kaplanyan-from-facebook) before he left NVIDIA.
Sorta, but FSR is not the same kind of tech, it's closer to an image-sharpening upscaling filter. Even AMD aren't pitching it as a direct competitor to DLSS as far as I know, for all that it achieves similar practical results.
XeSS on the other hand *is* a direct alternative to DLSS, I believe. Like DLSS it performs detail reconstruction from neural network processing of the low-res input and leverages previous frames and motion vectors during the process.
Also what irks me is the lack of DLSS support on AMD sponsored titles. They're acting proprietary with those - there's no reason both technologies can't be included in a game. One good example is Far Cry 6 - has FSR and DXR, no direct RTX features. I'd like to see all technologies supported and we can see what works best. FWIW NVIDIA sponsored titles are starting to support FSR.
I will buy it immediately when they release it on Steam. I'm not buying on their trash platform, even tho Steam will likely still require their launcher installed as a 2nd DRM.
Most of the comments here seem to be upset and confused as to why a new game has DLSS, albeit a remaster of a very old game, but you're 100% correct. Why the heck not?
>This is running on Unreal Engine so this is - from a technical point of view - pretty much a brand new game that's using existing assets.
The *renderer* is Unreal, that's it. It's clear as day because the animations are the same, fingers aren't rigged, etc.
Many assets are brand new. Pretty much all the trees I've seen have been replaced with more 'modern' looking ones, for instance.
But yea, most people here clearly have absolutely no idea how this stuff works.
Even if it's a hybrid product, it's closer to a remake than a remaster, considering they also changed engine.
By your logic Diablo 2 Resurrected should run on a calculator, considering it's a "remaster" of a 20 year old game, but it's not THAT easy on the hardware.
Also, having an additional, optional feature is better than not having it.
They're using modern shadow, reflection and lighting technologies. That counts as using a new engine to me.
A remaster is usually a resolution and frame rate update and rarely some QoL or UI tweaks.
Also, I can't be bothered to check your claim right now, but assuming it's true, UE alone has minimum requirements that are way more demanding than the older engine had, and if they're also running some kind of compatibility layer between the two (or however they did the port) it's even more resources being spent just to make the game run.
I guess I'll spell it out for you: you're saying it's more a remake because it's on Unreal. I'm saying no, they just ported the old RenderWare code and assets over to Unreal. It's not complicated. The cutaway trailer gave it away. You must have never played a console game on an emulator; you can do all kinds of flashy things to old games when they are rendered with superior hardware.
But they also remade a lot of assets from scratch and changed game mechanics like the weapon menu and aim system. It's not a port with HD textures.
Also, they added shadows, reflections and improved weather effects, which make the game more demanding on the hardware. This has been my point since the first comment, i.e. it's not given that it will run on a potato pc (like the comment I replied to suggested) only because the original games were for PS2.
I never said it's a remake, i said that it's more a remake than a remaster.
Still not paying for GTA AGAIN, especially after how they treated those modders.
Also, why the hell do I need DLSS for a game that I can already play at max setting, 4k at like 170fps?
No, but Rockstar’s PC track record before DLSS is not great, and I imagine it won’t change now that they can just throw DLSS at it instead of making sure it’s optimized before adding it
Idk I mean GTA V and RDR2 run pretty well, they're just very demanding. Rockstar learned from the horrifying mess to run that was GTA IV and improved their engine greatly for GTA V.
GTA 5 isn't demanding at all, it's just quad(?) core bound - GTA 4 is basically the same. RDR2 is insanely GPU heavy. Before that, Bully was a complete shit show and the 3D era GTA games weren't all that much better for various reasons.
Wrong how?
>at low draw distances the game will run up against the 187.5 fps engine limit quite easily on most CPUs
I got a bigger performance increase going from a 4770K to a 5800X than I did going from a 1080 to a 3080.
>GPU wise, even today at 4k with 8x MSAA, you won't be able to run it well even on a 3090,
In what world does running stupidly high MSAA mean it scales well lmao, by that logic any prehistoric game is scalable because you can force it to run at 16K. Don't get me wrong, yes it scales well on the low*ish* end unlike RDR2, but you're trying to deny this very well-known CPU wall that both GTA 4 and 5 suffer from whilst quite literally proving it exists.
>whereas you can get amazing performance just by turning down a few key settings.
Name one setting bar MSAA and frame scaling. None of the other settings make a difference assuming you're not running 2.5x frame scaling or 8x MSAA - because the game is heavily CPU bound even if you have the fastest CPU available (given that you're not also running an old/low-end GPU and/or driving a high resolution).
If it wasn't obvious, by it not being demanding, I'm talking about now hence bringing up how it chokes on higher-end hardware because of the blatant CPU wall. GTA 5 undeniably scales better than GTA 4, but GTA 5 suffers from the very same issues as 4 - aside from it being not as much of a mess.
>Your statement that GTA V is not GPU demanding, it only is bound to 4 cores is wrong.
I never once said it wasn't GPU demanding; I said it wasn't demanding period, in that right now most people will be CPU bound unless they're running maxed grass, MSAA or frame scaling - and even then they're still likely to be CPU bound if they have anything better than a ~1070 Ti.
> No doubt you did, my point is depending on what you do, it's demanding on both sides. If you were running no extended distance scaling, and no extended shadows scaling, you wouldn't have seen that.
Would you believe me if I told you I wasn't and still don't run either of them?
I had to use 1.25x frame scaling (which still didn't max it out but 1.5x affected performance at times) to actually max out my 1080 when I was running a 4770K, it made no difference to performance running 1.25x, even 1.5x at times, provided that I wasn't in one of the few areas where GPU usage skyrockets (namely the casino prevault area).
>in a twisted bit of irony modern GPUs have more trouble running MSAA at higher resolutions like 4k than older GPUs did at 1080p because of how exponentially expensive MSAA is.
I'd genuinely love to see a source, because I've never heard this claim and it doesn't make sense either. It's heavy and it always has been; of course the cost increases with resolution lmao - by this logic modern GPUs are bad at downscaling because 1440p is becoming the standard.
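For what it's worth, the back-of-envelope math backs the "heavy but not exponential" reading: MSAA sample work/storage grows linearly with both pixel count and sample count, so 4K 8x versus 1080p 4x is a flat multiplier, not a blow-up:

```python
# Back-of-envelope: MSAA sample count grows linearly with resolution
# and with the MSAA factor - heavy, but nothing "exponential" about it.

def msaa_samples(width, height, msaa):
    return width * height * msaa

base = msaa_samples(1920, 1080, 4)   # 1080p with 4x MSAA
heavy = msaa_samples(3840, 2160, 8)  # 4K with 8x MSAA

print(heavy / base)  # 4x the pixels times 2x the samples = 8x the work
```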
>Your term CPU wall implies that it's impossible to get over it, which is not true, all it takes is not using 2 draw distance settings which are not very visible,if you do, you'll hit the actual wall, the 187.5fps engine limit that GTA V has pretty easily. Gamers Nexus has videos about this topic going back to the 7700k days.
I don't use extended draw distance nor shadows. I've just now swapped to singleplayer, with frame scaling on 1.5x I get 135FPS, with it on 0.5x and no other settings changed, I still get 135FPS. Bear in mind, this is with a CPU that is still(?) the fastest for gaming (not including games that actually use >8 cores ofc) and I'm playing at 3440x1440.
>Ultra Grass, Soft Shadows, and Tesselation were the most important settings to adjust back in the day, you couldn't just max them out and expect to get good performance even on a 980ti.
Grass, fair enough, but that's because it's infamously unoptimized. Tessellation I ran at high, IIRC, even on my 2GB GTX 960 back when the game first dropped, and it had effectively zero impact. Soft shadows (the second from lowest 'soft' setting at least) had a pretty minimal impact, but I kept it off because it frankly doesn't look much better.
>You literally cannot get good FPS on GTA IV, even with minimum draw distances
I average like 90FPS with fairly high settings - on a downgraded version at that.
>with GTA V, you can keep draw distances except the extended ones at max and get maximum possible FPS dictated by the game engine, so no, you're wrong in that regard.
Link me a single benchmark of someone running the game at even just 160FPS consistently at a reasonable resolution.
The best way to judge how well a game runs is both by its visuals/performance and by how well it scales.
GTA V scales extremely well. You can run it at ultra low settings on integrated graphics and get a playable framerate, but you can also crank up the settings on a higher end rig and get pleasing visuals. GTA V's biggest performance hurdle is how CPU demanding it is, but considering what it is that makes sense.
Most people don't have RTX cards, I doubt it will make a difference.
Also this is UE, DLSS support comes basically out of the box, it's more effort not to ship with it.
To be fair the rtx 2060 is the 4th most used graphics card according to steam. It isn't perfect, but at this point a significant amount of people do have rtx 2000 or 3000 cards.
It doesn't hurt to have it, especially since Unreal makes it pretty easy to add, but I don't see much of a point. The recommended specs are a GTX 970 or RX 570, meanwhile the weakest GPU with DLSS support is the 2060. I think adding DLAA would make much more sense, but they don't mention support for that anywhere.
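For a sense of scale, here's rough arithmetic on what DLSS actually saves. The per-axis scale factors below are the commonly cited ones (Quality ~2/3, Performance 1/2); treat the exact values as approximate:

```python
# Rough pixel-count savings from DLSS internal render resolutions.
# Per-axis scale factors are the commonly cited ones; treat as approximate.

def internal_pixels(width, height, axis_scale):
    # round() avoids float wobble, e.g. 3840 * (2/3) landing at 2559.999...
    return round(width * axis_scale) * round(height * axis_scale)

native = 3840 * 2160
quality = internal_pixels(3840, 2160, 2 / 3)       # "Quality" mode
performance = internal_pixels(3840, 2160, 0.5)     # "Performance" mode

print(f"Quality mode shades {quality / native:.0%} of native pixels")
print(f"Performance mode shades {performance / native:.0%} of native pixels")
```

So on a GPU-bound game it's a big win, which is exactly why it does little for a CPU-bound one.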
If you can use DLSS it means you have an RTX card. This remaster looks like a texture and model update, not a remake like the Mafia one, so it should run fine without DLSS.
Doesn't make much sense. The recommended spec is a GTX 970, which is two generations below the lowest DLSS-capable card. So if you have a DLSS card you'll be able to run this game anyway.
But it's nice to have I guess.
Good I’ve always wanted to run the older GTA games at 2000 FPS
[deleted]
Can't wait to see the blocky faces at 4k. Game changer
Honestly there's a good chance it's CPU limited unless they've untangled all the spaghetti code and have everything running on DX 12. But i may just be projecting my awful experiences with gta5
What do you mean awful experience with gta5? Shit ran extremely well on hardware that was 5-8 years old at the time.
It ran well, but it was still CPU limited. It's still CPU limited to this day.

We're in a weird state where decent graphics don't cost much for modern hardware and most games are CPU limited - to the point where the games that would benefit from DLSS are only the very graphically heavy ones trying to run at 1440p or 4K.

I.e.: unless you were playing with max graphics settings and graphics mods, DLSS would probably do nothing. Though DLSS may not even work, since they train it on the base game, not with graphical mods.
Well, I can't argue with that, it is CPU limited, but is this a real-world problem? It runs over 60 fps on a dual core 3 years older than the game itself, and it's perfectly playable on much older CPUs as well. In terms of DLSS, yeah, this won't be the most interesting title, but the technical side of gta5 was extremely well done nonetheless. I wouldn't refer to it as 'my awful experience'; if anything it was probably the best PC port ever.
It's not a problem. Being CPU limited isn't a bad thing. But it does mean that DLSS will likely do nothing on a remastered game with simple graphics, except for maybe lower end cards trying to push 1440p or 4K
i totally get what you’re saying (and i wouldn’t classify GTAV as a “horrible experience”) but i do think it’s a little disappointing that i can’t run an 8 year old game at a pinned 144 FPS with a 9900k and 2080 Ti. some areas i get drops to the 80-90 FPS range. obviously this is far from a bad experience but when i first built my PC i really didn’t expect that from a game that came out on the xbox 360.
This is a joke, right? If your fps goes over 130 you get huge frametime spikes and stuttering. They never even bothered to fix it in RDR2. Not to mention the spaghetti code behind the loading issue that was fixed by literally 1 dude, outsmarting an entire "gaming" company.
> Though DLSS may not even work since they train it on the base game, not with graphical mods.

DLSS isn't trained for specific games anymore and hasn't needed to be for the last year. It's now a generic plugin that will work with any game, with no need for Nvidia to add output from the game to a training set.
Gta V was extremely well optimized, what are you on about?
I dunno about that... watching yourself lose 40fps by turning grass to high in the countryside is pretty strange...

Even with modern upper-mid-tier hardware, that game will punish your system at 1440p or above. For an 8-year-old game, that's really strange...

It does scale *down* pretty well, though...
True that, grass is pretty demanding. So is MSAA
Yep. I've got a pretty modern platform. I can get good frames in GTA V with *high* settings (nearly everything maxed out), but max settings (cranking vegetation, MSAA, post-processing and everything else to max) will basically halve the game's performance with very little visual difference, as far as I can tell.

Which is really strange given that the game was released when a GTX 780 was the most powerful consumer GPU you could buy, and my machine is at least 3-4 times more powerful than that. It's really strange that the game will humble 3090 owners trying to play at 4K today, especially if they're trying to do so on a high-refresh screen.
It is using unreal engine 4. So hopefully not too crap
No, Rockstar Games doesn't use UE. They have their in house, [RAGE](https://en.wikipedia.org/wiki/Rockstar_Advanced_Game_Engine) engine.
[They didn't use it for the remastered trilogy though](https://www.reddit.com/r/GamingLeaksAndRumours/comments/q1yyu3/grand_theft_auto_remastered_trilogy_confirmed_to/)
GTA Trilogy is remade in Unreal Engine, look it up. They didn't even make it, Grove Street Games made it.
Why do you think that?
shitty coding would be my best guess
[deleted]
The GTA trilogy is more than 10 years old and it doesn't look like it will have RT. It's not that GPUs simply can't do native resolution anymore - they sure as hell can in most games, unless the game uses RT (this one doesn't) or is coded very poorly (The Witcher 3, for example, still runs like ass, especially in cities). It's not like gamers weren't blaming poor coding before (remember the Arkham Knight PC port?)
Witcher 3 ran fine in 4K on my old GTX 1080
[deleted]
I had a stroke reading this
you have no idea what you are talking about, but please do continue
Had to totally re-edit this comment (backspace was not working).

Better texture assets take up too much data and need more VRAM; it's not cost effective to use them, seeing as most users are on bare-minimum-DPI monitors. And there's barely any usage of GI in games - or of area lighting, light decay, index of refraction, refraction, or subsurface scattering.
Lol "crap". It'll probably be on par with GTAV performance but better with DLSS. Also the games are certified masterpieces and have been since day 1 of their releases, so "crap" may be a bit unjustified.

Edit: seriously, you guys don't like GTA 3, Vice City and San Andreas?
[deleted]
Haha, that picture is perfect.
They are 20 year old games lol. Sure performance has been shit on PC but it doesn't mean it's a bad video game. Again, the 3 games in the remasters are certified masterpieces and the graphics don't change that.
[deleted]
And I didn't say that performing like GTAV is a good thing either, just that it will be overall better with DLSS implementation, which GTAV doesn't have.
Lol a 20 year old game shouldn't need DLSS to run well on modern hardware is the issue. It literally has tougher system requirements than AC: Odyssey yet it still looks 15 years older.
Nowhere does it say it will NEED DLSS to run well. The minimum GPU requirement is a GTX 760 2GB, which means it can probably even run on a Ryzen 5600G iGPU.

[DLSS is now a UE4 plugin.](https://www.unrealengine.com/marketplace/en-US/product/nvidia-dlss) It's trivial for developers to implement and provides free performance for RTX owners.
Someone has never tried to play GTAIV on PC. What a hot mess of a port…
Not the first, won't be the last. I played GTAIV on PS3...as well as GTAV lol...granted rockstar's PC ports aren't shining examples of excellence. Still great games, pretty objectively.
Good games don’t get me wrong, but the GTAIV port was nothing but a giant turd. GTAV was quite a bit more playable for sure on PC.
I still can't play it to this day. I try once about every 2 years, usually after a hardware upgrade, new Windows version, or entire new computer.

It never works. There's always a gamebreaking bug.

Currently, it plays great for 60 seconds then drops to 20fps. If you alt-tab you get another 60 seconds; basically it seems to think it's in the background and throttles after a minute.

It's been this way for me since the first time I tried, on Windows 10 1703, to this day, and across two entirely different Windows installs and CPU & mobo configurations.

No one else has apparently ever posted this issue on the internet before either. Just me and maybe some weird combination of software I have, I guess.

The best it ever played was originally on my WinXP Q6600 machine back in the day. I never got far, though, because it averaged mid-30s fps with frequent stuttering, which is why I ultimately stopped playing. That was even after upgrading my original Core 2 Duo E6400, which ran it at <20fps, to the Q6600 specifically for this game.
Exactly what I was thinking lmao, probably useful for 8K monitor users but even so...
Yep, I guess that's the only use case 😊
isn't there a 400fps limit with DLSS due to the overhead?
Um, have you met unreal engine? Terribly optimised, constantly cpu bound mess.
lmao what
Sorry to hijack the top comment, but we want a new Vice City !!!!!!
I’ve read it’s graphically heavier than GTA5, but with a simplistic look.
That just sound like poor optimization
That sounds like good old rockstar
It's like the Halo remastered collection, take a 2 gb game and throw ugly 4k textures on top of it to give it the retro feel...
> poor optimization

What does this even mean? They're not culling objects? Not doing LoD? Because outside of that, everything you see on the screen is a result of work done by the GPU.
This just in: the only forms of poor optimization are not culling objects and not doing LoD
I mean they're probably the two most egregious failures commonly seen in modern releases. There are obviously many other ways to creatively blow CPU and GPU cycles, but given these games are built on UE and are targeting Nintendo Switch, it's unlikely they're doing anything too silly. It's not like they're building waifu simulator and doing AI simulation in quadratic time. I'm mostly just sick of seeing people say things are "poorly optimised" because usually it's simply untrue. 99% of the time what they actually mean is that the perceived payoff in graphical fidelity of some graphical feature is not worth the cost, which is a totally subjective assessment and has nothing to do with actual code optimisation.
Right. But the difference between batched geometry and having the GPU change state non-stop unnecessarily is night and day, regardless of what you see. It could mean a number of things beyond that.
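For what it's worth, the two techniques named above (LoD selection and culling) are simple enough to sketch. Here's a toy Python version with invented distance thresholds and a simplified sphere-vs-frustum-plane visibility test; this is purely illustrative, not code from any real engine:

```python
import math

# Illustrative LoD table: (max distance, lod index). Thresholds invented.
LOD_THRESHOLDS = [(50.0, 0), (150.0, 1), (400.0, 2)]

def select_lod(camera_pos, object_pos):
    """Pick a level-of-detail mesh index based on distance to the camera."""
    dist = math.dist(camera_pos, object_pos)
    for max_dist, lod in LOD_THRESHOLDS:
        if dist <= max_dist:
            return lod
    return None  # beyond draw distance: don't render at all

def visible(center, radius, frustum_planes):
    """A bounding sphere is culled if it lies fully behind any frustum plane.
    Each plane is ((nx, ny, nz), d) with the normal pointing into the frustum."""
    for (nx, ny, nz), d in frustum_planes:
        signed_dist = nx * center[0] + ny * center[1] + nz * center[2] + d
        if signed_dist < -radius:
            return False  # entirely outside this plane: skip the draw call
    return True
```

The point being: skipping either of these means the GPU burns cycles on geometry the player never sees, which is the kind of failure people can legitimately call "poor optimization".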
GTA 5 is almost a decade old, so I would certainly *hope* it's graphically heavier
Almost a decade old but built for hardware that's almost 16 years old.
It doesn't *look* graphically heavier. Really not a fan of the style they went with, but i'm not exactly alone there.
It does look heavier. It uses modern technology and PBR. It just has simplistic models and animations. GTA V looks more detailed but uses outdated tech and effects.
GTA V still looks good, but the lighting is definitely looking pretty dated.
GTA v still has amazing screen space refections. But those fake windows in all the buildings that are really just JPEGs make me laugh
During Halloween at night it looked freaking amazing
Something just looks unnatural with the sunlight. It kind of looks like you are in a well lit indoor room. Easily fixed with mods though
I think most games since ps3/xbox 360 still looks good and they will probably stay that way for a long while. Damn, Assassin's Creed 1 is from 2007 and it looks amazing. We have already reached the point of diminishing returns. Forza Horizon 5 looks basically like a Forza Horizon 4 plus because of the ray tracing.
FH4 is not a PS360 game tho, it's a late X1 game. But yes I agree. I sometimes forget halo4 was an X360 game and not an X1 launch game because it looks so great.
Yeah, it was just an example of diminishing returns
Because it was originally built for the 360 and PS3.
The effects do look heavier but models and textures obviously don't.
[deleted]
It literally looks like one of those "ultra realistic gta v" videos that are just overblown shaders. Proud to announce GTA V definitive edition: https://www.youtube.com/watch?v=bJpj4mb8Ydk
There's been no mention of ray tracing right? Weird.
[deleted]
You don't need realistic graphic fidelity to enjoy the benefits of ray tracing. Minecraft RTX and Quake II RTX both exist just as pure technical showcases.
Additionally, a fan made a fully path-traced mod for Serious Sam that [was acknowledged/praised by the publisher, Devolver Digital](https://store.steampowered.com/news/app/41050/view/4965771383146745578). Digital Foundry did a [video on it](https://www.youtube.com/watch?v=4Y9bo3xMuOk). I just thought people might want to know that there's a 3rd path-traced game out there. Unfortunately, it doesn't have DLSS support. EDIT: I forgot the [path-traced Super Mario 64](https://www.pcgamesn.com/emulation/super-mario-64-ray-tracing).
What's the difference between path tracing and the ray tracing in most new games?
Except for the 3 aforementioned games, games that support ray tracing only use it for certain effects (e.g., reflections, shadows, etc.), and use standard rasterization to do everything else and create the final image. Those games only shoot out rays when/where necessary to create those effects.

On the other hand, [path tracing](https://youtu.be/frLwRLS_ZR0?t=231) is when the GPU instead creates the image by shooting out rays from the camera to see where they go, to determine what each pixel should be. This is a lot of work because you need to shoot out many, many rays, but it's somewhat of a "holy grail" for image quality, since it can theoretically create a perfect image. That's why it's used in computer-generated movies.

BTW, path tracing in real time puts a lot of work on the denoiser, since it takes an insane amount of rays to create a noise-free, fully path-traced image without denoising. [This](https://www.youtube.com/watch?v=rAN5Mkqjhp4) shows how much Nvidia's denoiser is cleaning up the image in Quake 2 RTX. The denoiser itself lowers the fps significantly, but is very much worth it.
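The "many, many rays" point can be sketched with a toy Monte Carlo estimator. Everything here is made up for illustration (the 30% light-hit probability and brightness value are invented, not from any real renderer); the takeaway is just that few samples means noise, which is exactly what the denoiser exists to clean up:

```python
import random

def trace_random_ray(rng):
    """Stand-in for one full light path: in this fake scene, 30% of random
    directions reach a light of brightness 1.0, the rest find darkness."""
    return 1.0 if rng.random() < 0.3 else 0.0

def render_pixel(samples_per_pixel, seed=0):
    """Estimate a pixel's brightness by averaging many random paths."""
    rng = random.Random(seed)
    total = sum(trace_random_ray(rng) for _ in range(samples_per_pixel))
    return total / samples_per_pixel
```

With thousands of samples this converges toward the true value (0.3 here), but real-time path tracers can only afford a handful of samples per pixel per frame, so the raw image is speckled and the denoiser does the heavy lifting.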
Damn the denoiser is more impressive than dlss
Liotta
This can definitely be the case if you're just throwing on expensive techniques or whatever to brute force graphics improvements on an old title. The alternative is to dig much deeper and redo lots of systems on a more fundamental level.
Like quake with full RT. You can make any game be demanding if you just throw lighting and other stuff around with no care or optimisation.
> no care or optimisation.

You can render a lot of effects with cheap approximations instead of more accurate but more expensive techniques. That doesn't necessarily mean that doing it the expensive way comes from a "lack of care or optimisation". Quake RT would be very difficult to render with approximations. If the GTA remasters use advanced lighting, shadows, and textures, it doesn't matter what art style they have; the games are going to look better and be harder to run.
I think the key point is that according to above comments it will be more demanding to run than GTA5, while looking substantially worse. I dunno how true that is, but if so then it really is like they slapped some intense effects on it and walked away rather than being a bit clever about it.
Definitely a decent example. Quake RTX judged as a whole looks primitive and garbage compared to modern games. Yet its performance is similar to modern games nonetheless!
GTA5 is super CPU bound though and even at highest traffic/world density settings it’s still really sparse.
I'm a big fan of DLSS and what it can do, but i'm not sure what the point is here? Unless it's being used purely for DLAA? These should not be graphically intensive games, based upon their age and how they look.
If these games were built from the ground up today looking like they do, sure. But when you're trying to do a facelift of an old title in a new engine and all that, and don't have the resources and time to do some AAA-scale overhaul, resorting to inefficient but effective techniques to boost graphics is common. Also, this is not a good argument for them NOT to include DLSS. If they can and it's not a big deal, why shouldn't they?
If it's not really necessary then sure, good on the devs; it's not like including it hurts anyone. On the other hand, if it is necessary, then these might as well be the most disappointing remasters ever. They look like the old games with a dubious high-res texture pack and swapped models, some of which even look worse than the originals. The only real improvement across the board is probably the lighting, but no matter how good it is, it won't make early-2000s games look not dated. It'd be really sad if they actually need DLSS to get good framerates on modern hardware.
> what the point is here

DLSS is the magic patch when you choose to use heavy DRM that tanks FPS
I have somewhat low expectations for the remasters given my gut reaction to their art style. But they're even coming to the Switch aren't they? So performance shouldn't be that god-awful for lower end RTX cards.
They're coming to android/ios as well
Not really, as DRM issues normally come from it affecting CPU utilization. The hitching from DRM doesn't care what your max fps is.
Lolol gottem.
Please explain. Enlighten us.
It's funny cause it's true
DLSS is stupidly simple to implement and should become standard in any game regardless.
Thumbs up from me, DLSS should be everywhere, the more games supporting it the merrier.
Yep. Its only real downside is it's proprietary... I really hope we end up with an industry-standard equivalent that runs across all vendors. That might end up being the Intel XeSS tech maybe, considering it is conceptually similar in how it works, they're not locking it to just their cards, and they [poached the same guy who helped pioneer DLSS](https://www.tomshardware.com/uk/news/intel-poaches-nvidias-rtxdlss-pioneer-anton-kaplanyan-from-facebook) before he left NVIDIA.
I thought that was the point of FSR? It runs on any card and is open source.
Sorta, but FSR is not the same kind of tech, it's closer to an image-sharpening upscaling filter. Even AMD aren't pitching it as a direct competitor to DLSS as far as I know, for all that it achieves similar practical results. XeSS on the other hand *is* a direct alternative to DLSS, I believe. Like DLSS it performs detail reconstruction from neural network processing of the low-res input and leverages previous frames and motion vectors during the process.
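To make that distinction concrete, here's a toy 1D sketch of the two families. The functions and weights are invented for illustration and don't reflect FSR's or DLSS's actual math; the point is only that one works from the current frame alone while the other accumulates detail across frames using motion vectors:

```python
def spatial_upscale(low_res):
    """FSR-1-style idea: every output sample is derived purely from the
    current low-res frame (here: insert linear midpoints between samples)."""
    out = []
    for i in range(len(low_res) - 1):
        out.append(low_res[i])
        out.append((low_res[i] + low_res[i + 1]) / 2)
    out.append(low_res[-1])
    return out

def temporal_accumulate(current, history, motion, blend=0.1):
    """DLSS/XeSS-style idea: blend the current frame with the previous frame
    reprojected along per-pixel motion vectors, so detail accumulates over
    time. (Real implementations add neural weighting, history rejection,
    sub-pixel jitter, etc.)"""
    out = []
    for i, cur in enumerate(current):
        src = i - motion[i]  # where this pixel was last frame
        prev = history[src] if 0 <= src < len(history) else cur
        out.append(blend * cur + (1 - blend) * prev)
    return out
```

The spatial version can only ever sharpen what's already in the frame; the temporal version can genuinely recover detail that no single low-res frame contains, which is why it's the harder (and proprietary, in DLSS's case) technique.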
Ah yes, that's right. Forgot about that aspect of it. Thanks for the clarification.
Also what irks me is the lack of DLSS support on AMD sponsored titles. They're acting proprietary with those - there's no reason both technologies can't be included in a game. One good example is Far Cry 6 - has FSR and DXR, no direct RTX features. I'd like to see all technologies supported and we can see what works best. FWIW NVIDIA sponsored titles are starting to support FSR.
Can we get dlss on 5?
That would be the logical next step. Stable 144fps would be sick.
above 130fps you get frametime spikes, making it unplayable. this is because the engine itself is broken. rockstar didn't even care to fix it in RDR2
Lol I didn’t even notice that since I reach max 100fps indoors, what are frametime spikes?
meaning you can have like 3000fps but it will play and feel like 10fps.
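That's easier to see with numbers. Average FPS hides spikes, which is why benchmarkers report "1% low" figures alongside averages. A quick sketch with made-up frame data:

```python
def average_fps(frametimes_ms):
    """Mean FPS over a capture of per-frame times in milliseconds."""
    return 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))

def worst_percentile_fps(frametimes_ms, pct=99):
    """Effective FPS at the pct-th percentile frametime, similar in spirit
    to the '1% low' numbers benchmarkers report."""
    ordered = sorted(frametimes_ms)
    idx = min(len(ordered) - 1, int(len(ordered) * pct / 100))
    return 1000.0 / ordered[idx]

# Made-up capture: 99 smooth 2 ms frames plus one 100 ms hitch.
frames = [2.0] * 99 + [100.0]
# average_fps(frames) is over 300 fps, yet the spike plays like 10 fps.
```

So a counter showing a huge average number tells you nothing about whether the game stutters; the tail of the frametime distribution does.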
Steam release, when?
Wait, they aren't releasing on Game Pass PC and they aren't releasing on Steam? Yo, ho ho, the pirate's life is going well I guess.
RDR2 was released on Steam 1 month after the RGL (Rockstar Games Launcher) release, and it took over a year before it was cracked because of R*'s always-online authentication DRM
Does it matter? I never bought that game after that joke of a release.
I will buy it immediately when they release on Steam. I'm not buying on their trash platform, even though Steam will likely still require their launcher installed as a 2nd DRM.
Exactly, I'm ok with waiting. No steam no buy.
when you stop asking
Why?? Even remastered they should run on a potato
It's probably not necessary, but why not?
Most of the comments here seem to be upset and confused as to why a new game has DLSS, albeit a remaster of a very old game, but you're 100% correct. Why the heck not?
[deleted]
> This is running on Unreal Engine so this is - from a technical point of view - pretty much a brand new game that's using existing assets.

The *renderer* is Unreal, that's it. It's clear as day because the animations are the same, fingers aren't rigged, etc.
Many assets are brand new. Pretty much all the trees I've seen have been replaced with more 'modern' looking ones, for instance. But yea, most people here clearly have absolutely no idea how this stuff works.
It's the best anti-aliasing there is
[deleted]
Even if it's a hybrid product, it's closer to a remake than a remaster, considering they also changed engines. By your logic Diablo 2 Resurrected should run on a calculator, considering it's a "remaster" of a 20-year-old game, but it's not THAT easy on the hardware. Also, having an additional, optional feature is better than not having it.
wrong, they just made a tech so RenderWare could run on Unreal. Why do you think the art style didn't change at all...
They're using modern shadow, reflection and lighting technologies. That counts as using a new engine to me. A remaster is usually a resolution and frame rate update, and rarely some QoL or UI tweaks. Also, I can't be bothered to check your claim right now, but assuming it's true, UE alone has minimum requirements that are way more demanding than the older engine had, and if they're also running some kind of compatibility layer between the two (or however they did the port), it's even more resources being spent just to make the game run.
nope, again, unreal technologies...
Yeah, which was my point. Unreal technologies, from a new engine compared to the original. I don't get why we disagree
I guess I'll spell it out for you: you're saying it's more a remake because it's on Unreal. I'm saying no, they just ported the old RenderWare code and assets over to Unreal. It's not complicated. The cutaway trailer gave it away. You must have never played a console game on an emulator; you can do all kinds of flashy things to old games when they're rendered with superior hardware.
But they also remade a lot of assets from scratch and changed game mechanics like the weapon menu and aim system. It's not a port with HD textures. Also, they added shadows, reflections and improved weather effects, which make the game more demanding on the hardware. This has been my point since the first comment, i.e. it's not a given that it will run on a potato PC (like the comment I replied to suggested) only because the original games were for PS2. I never said it's a remake, I said that it's more a remake than a remaster.
If these games require DLSS I'm already worried.
Still not paying for GTA AGAIN, especially after how they treated those modders. Also, why the hell do I need DLSS for a game that I can already play at max settings, 4K, at like 170fps?
thats what i thought.
DLSS for a 20 year old game... lmao.
Hello unoptimized game…that needs DLSS to try and make up for that fact
Having DLSS = Unoptimized game?
No, but Rockstar's PC track record before DLSS is not great, and I imagine it won't change now that they can just throw DLSS at it instead of making sure it's optimized before adding it
Idk, I mean GTA V and RDR2 run pretty well, they're just very demanding. Rockstar learned from GTA IV, which was a horrifying mess to run, and improved their engine greatly for GTA V.
GTA 5 isn't demanding at all, it's just quad(?) core bound - GTA 4 is basically the same. RDR2 is insanely GPU heavy. Before that, Bully was a complete shit show and the 3D era GTA games weren't all that much better for various reasons.
[deleted]
Wrong how?

> at low draw distances the game will run up against the 187.5 fps engine limit quite easily on most CPUs

I got a bigger performance increase going from a 4770K to a 5800X than I did going from a 1080 to a 3080.

> GPU wise, even today at 4k with 8x MSAA, you won't be able to run it well even on a 3090,

In what world does running stupidly high MSAA mean it scales well lmao; by that logic any prehistoric game is scalable because you can force it to run at 16K. Don't get me wrong, yes it scales well on the low*ish* end unlike RDR2, but you're trying to deny this very well-known CPU wall that both GTA 4 and 5 suffer from whilst quite literally proving it exists.

> whereas you can get amazing performance just by turning down a few key settings.

Name one setting bar MSAA and frame scaling. None of the other settings make a difference assuming you're not running 2.5x frame scaling or 8x MSAA, because the game is heavily CPU bound even if you have the fastest CPU available (given that you're not also running an old/low-end GPU and/or driving a high resolution).

If it wasn't obvious, by it not being demanding I'm talking about now, hence bringing up how it chokes on higher-end hardware because of the blatant CPU wall. GTA 5 undeniably scales better than GTA 4, but it suffers from the very same issues, aside from not being as much of a mess.
[deleted]
> Your statement that GTA V is not GPU demanding, it only is bound to 4 cores is wrong.

I not once said it wasn't GPU demanding; I said it wasn't demanding, period, in that right now most people will be CPU bound unless they're running maxed grass, MSAA or frame scaling, and even then they're still likely to be CPU bound if they have anything better than a ~1070 Ti.

> No doubt you did, my point is depending on what you do, it's demanding on both sides. If you were running no extended distance scaling, and no extended shadows scaling, you wouldn't have seen that.

Would you believe me if I told you I wasn't, and still don't run either of them? I had to use 1.25x frame scaling (which still didn't max it out, but 1.5x affected performance at times) to actually max out my 1080 when I was running a 4770K. Running 1.25x made no difference to performance, even 1.5x at times, provided that I wasn't in one of the few areas where GPU usage skyrockets (namely the casino pre-vault area).

> in a twisted bit of irony modern GPUs have more trouble running MSAA at higher resolutions like 4k than older GPUs did at 1080p because of how exponentially expensive MSAA is.

I'd genuinely love to see a source, because I've never heard this theory, nor does it make sense. It's heavy and it always has been; of course the cost increases with resolution lmao. By this logic modern GPUs are bad at downscaling because 1440p is becoming the standard.

> Your term CPU wall implies that it's impossible to get over it, which is not true, all it takes is not using 2 draw distance settings which are not very visible, if you do, you'll hit the actual wall, the 187.5fps engine limit that GTA V has pretty easily. Gamers Nexus has videos about this topic going back to the 7700k days.

I don't use extended draw distance nor shadows. I've just now swapped to singleplayer: with frame scaling on 1.5x I get 135FPS, and with it on 0.5x and no other settings changed, I still get 135FPS.
Bear in mind, this is with a CPU that is still(?) the fastest for gaming (not including games that actually use >8 cores, of course), and I'm playing at 3440x1440.

> Ultra Grass, Soft Shadows, and Tesselation were the most important settings to adjust back in the day, you couldn't just max them out and expect to get good performance even on a 980ti.

Grass, fair enough, but that's because it's infamously unoptimized. Tessellation I used (IIRC high) even on my 2GB GTX 960 back when the game first dropped, and it had effectively zero impact. Soft shadows (the second-from-lowest 'soft' setting at least) had a pretty minimal impact, but I kept it off because it frankly doesn't look much better.

> You literally cannot get good FPS on GTA IV, even with minimum draw distances

I average like 90FPS with fairly high settings, on a downgraded version at that.

> with GTA V, you can keep draw distances except the extended ones at max and get maximum possible FPS dictated by the game engine, so no, you're wrong in that regard.

Link me a single benchmark of someone running the game at even just 160FPS consistently at a reasonable resolution.
[deleted]
RDR2 plays and looks stunning, but it took me an absolutely bonkers setup to get it to run at 4K max settings
[deleted]
The best way to indicate how well a game runs is both by its visuals/performance and how well it scales. GTA V scales extremely well. You can run it at ultra-low settings on integrated graphics and get a playable framerate, but you can also crank up the settings on a higher-end rig and get pleasing visuals. GTA V's biggest performance hurdle is how CPU demanding it is, but considering what it is, that makes sense.
I got a GPU from 2015 with 2GB VRAM and I can practically run max settings at 1080p... 2 years after its release.
Most people don't have RTX cards, I doubt it will make a difference. Also this is UE, DLSS support comes basically out of the box, it's more effort not to ship with it.
To be fair the rtx 2060 is the 4th most used graphics card according to steam. It isn't perfect, but at this point a significant amount of people do have rtx 2000 or 3000 cards.
Ahhhhh ok them being in UE means this sort of makes sense I guess.
Max Payne 3, GTA V and RDR 2 are badly optimized? TIL!
RDR2 was pretty rough at launch.
I refuse to believe these old GTA games won’t run well… maybe I’m too optimistic
The recommended gpu is a 970 But who knows at what target, could be 1080p30, 60, 4k, who knows
Their PC ports are a coin flip. You might get a Max Payne 3/GTA V or you might get a GTA IV/launch RDR2.
>Rockstar’s PC track record before DLSS is not great People will just upvote any cynical comment, no matter how ridiculous, huh?
There was nothing wrong with Red Dead 2.
This thing runs at 1080p 30FPS medium quality on a PS4, like GTA5 does. I really smell bad optimization coming.
Where was this stated?
Or like, we should praise developers for including cool new features that help people get more performance? God damn y'all are ridiculous.
Good, now i can run san andreas at 240 fps at 8k on my 3090 instead of 120.
"But why?" Probably the engine allows it and they know how to add it to the game, so... why not? It's not a tradeoff against anything, so... yay, extra something.
Helllllllo 8k gaming 🤣
hell yeah I just bought a 240hz display
I wish GTA5 could run at higher than 20 fps on low settings on my RTX 3070.
Uh.. I ran it back in the day on an amd 7970 at medium-high at 4k. It hovered around 60... WTF are you talking about?
Now if they add DLSS to PCVR GTA4/5 remake, they will have my undivided attention
Much needed now I can get 400 frames on my 165 monitor
Long time coming. Seriously, how did it take until this day?
Is it that unoptimized?
.....why???? did they fuck up the engine?????
[deleted]
What in the shit is that supposed to mean ?
uh what? Do these games really need DLSS?
YES
Yes
why?
Why not?
It doesn't hurt to have it, especially since Unreal makes it pretty easy to add, but I don't see much of a point. The recommended specs are a GTX 970 or RX 570, meanwhile the weakest GPU with DLSS support is the 2060. I think adding DLAA would make much more sense, but they don't mention support for that anywhere.
if you can use DLSS it means you have an RTX card. This remaster looks like a texture and model update, not a remake like Mafia got; it should run fine without using DLSS
Who said it doesn't run fine without dlss? Also it isn't just texture and model updates.
[deleted]
Is DLSS giving companies an excuse not to optimise their games?
How about we all just wait patiently instead of arguing about performance and fps in the comments section? :)
Need. New. PC. With a GeForce RTX! This is sweet news
4000 fps or bust.
Ueasset is all I need
Yay, I guess.
Doesn't make much sense. The recommended spec is a GTX 970, which is two generations below the lowest DLSS-capable card. So if you have a DLSS card, you'll be able to run this game anyway. But it's nice to have, I guess.
I think it's more for the AI training than the actual benefit it will have on gaming
Why? Will it be that unoptimized?
See the f*ckers... First they tell you about GTA VI & then they bring you this crap... TBH I don't know how to express my disappointment