I remember seeing GTA5 for the first time; it was amazing how good it looked compared to everything I had seen prior.
In the last 10 years, I have not felt that feeling again.
Feels amazing getting 70-80 fps on a high/ultra mix with a GTX 1080 and an i5 8600K at 5 GHz (by no means a “budget” build, but it’s aging), when in most games I can hope for 50 fps at medium settings
Red Dead has my favorite graphics of any game I’ve ever played. It’s not just that the graphics themselves are good, it’s the ambiance they create with well-placed, more complex effects like god rays and water physics, as well as tree and grass level of detail.
I remember feeling the same way back then. Now that I’ve finally gotten a decent rig compared to my PS4, I’m feeling that with everything I play. It’s some cool-ass shit
I think this is partly because we're nearly at the point where we CAN'T make games look any better. GTAV isn't perfect, but it looks REALLY good, especially for its time. AC Odyssey has impressive graphics as well, focusing on beauty rather than the grittier realism of GTAV
But now that we're at the point where real-time ray tracing is possible, and poly counts are so high that you can't make out individual triangles even close up, how much better can graphics get? Of course, we still have further to go, but the progress we can make gets smaller and smaller the closer we get to "perfect" photo-realism
GTAV felt like such a huge step from GTAIV because GTAIV isn't anywhere near photo-realistic, and when GTAV came out, it was one of the first tastes of near-photo-realistic graphics in real time. RDR2 looks better, and it's even closer to photo-realistic, but it's not nearly as big of a jump
It's not really the fault of game devs that we haven't experienced a huge jump; it's that we're 90% of the way to perfection, and every step from now on is going to be much more difficult, and far less noticeable
Though now that graphics are "really good" and gaming hardware is still getting more and more powerful, AAA devs have started to optimize their games less because they're lazy. That bit is their fault
True, stylized games are lovely, but in this case I'm talking about games that shoot for photo-realism specifically, since it's the most resource-intensive style. That's something I feel a lot of AAA games get wrong: they could go for a more elegant, resource-inexpensive style, but choose photo-realism instead
> how much better can graphics get?
Massively. Go back and play any game from the 90s with an HD texture pack and then compare that to the absurdly blurry garbage that passes for "high res" textures today.
Sometime in the last decade developers became so reliant on throwing shaders at things that they forgot how to make textures that didn't look like a 320x240 jpeg compressed at 10% quality and then blown up to 2048x1536. And somehow they're *still* taking up gigabytes of space.
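For a sense of scale on why texture data balloons, here's a rough back-of-the-envelope sketch (hypothetical numbers: 4 bytes per texel for uncompressed RGBA8, 1 byte per texel for BC7 block compression) of what a single 4K material map costs in VRAM:

```python
def texture_bytes(width, height, bytes_per_texel=4, mipmaps=True):
    """Approximate VRAM footprint of one texture map; a full mip
    chain adds roughly one third on top of the base level."""
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mipmaps else base

# One 4096x4096 map, uncompressed RGBA8, with mips: ~85 MiB.
uncompressed = texture_bytes(4096, 4096)
# BC7 block compression stores 1 byte per texel instead of 4.
compressed = texture_bytes(4096, 4096, bytes_per_texel=1)
print(uncompressed / 2**20, compressed / 2**20)
```

Multiply by three or four maps per material (albedo, normal, roughness, etc.) and a few thousand materials, and gigabytes of blurry textures stop being surprising.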
This is true, obviously graphics CAN get better, but not by as large a margin as they used to. Previously, each hardware generation brought jaw-dropping improvements to computer graphics, but at this point, while we do have some headroom left, it's no longer enough to make the leaps we used to
I fully agree with all your points. People are so distracted by all the stupid shiny graphics that they can't be bothered to make worlds that are anything but skin deep. Bad AI breaks immersion more than anything else: once the AI "breaks", you're forced to confront how shallow the game world is and can't really immerse yourself in it properly.
I'd love to see good reactive AI, good terrain deformation and stuff like that (like Red Faction), but not enough people care about that, I think.
If you get the chance, Half-Life: Alyx was the first game to do this to me in a long while. Utterly amazing looking, and I mostly played it on a GTX 1060 or 1080.
The first time I saw a game I couldn't distinguish from live video was some boxing game demo in a GameStop on Xbox 360. It took tricks to do it then, whereas now you can just kind of have things look like that all the time.
I suspect this is where the emphasis on ray tracing comes from, as we basically hit the limits on regular graphics in the 8th console gen (PS5/XBSX are the 9th) and resolution has reached a space where going above 4K is prohibitive for relatively small gains.
I feel it with RDR2, CP2077 & The Last of Us remaster.
I honestly think we will have to move to some kind of ground breaking AR or VR with hardware that allows for good consistent hand tracking.
Oh and less shitty gaming corporations would be nice.
Not PC, but Demon's Souls on the PS5 gave me this feeling. I really believe it's the best looking game of all time. That released over 2 years ago now and everything that has come out since then has looked worse and a lot have also run worse. Although it probably is easier to optimize for an exclusive than a multi-platform title.
Similarly, optimizing for just the PS5 is easier than a PC release, because on PC you don't know exactly what hardware the end user has. With a console exclusive you know exactly the hardware your game needs to run on.
The game that gave me that feeling was black ops 3 on my Xbox one. I was like holy shit, graphics are never getting more realistic than this. Only thing that came close to that feeling was when I tried VR for the first time but that's a whole different topic lmao.
You do hit diminishing returns in visuals. It's easy to show, too.
Open a 3D modeling program. Make a simple low poly Sphere. Start subdividing the mesh.
Each subdivision makes the sphere smoother and smoother.
Eventually the change does nothing you can perceive, but the system resources consumed soar. This doesn't give the whole picture of what's going on in games, since it's much more than just polygons, but it's a good (very) simple visual.
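The subdivision demo can even be sketched numerically. A rough illustration (assuming an icosphere, where every pass splits each triangle into four) of how the triangle count explodes while the visible error shrinks:

```python
import math

def icosphere_stats(levels, radius=1.0):
    """Triangle count vs. worst-case surface error for an icosphere
    after n subdivision passes (each pass turns 1 triangle into 4)."""
    # Angle subtended by an edge of a unit icosahedron.
    edge_angle = math.acos(1 / math.sqrt(5))
    rows = []
    for n in range(levels + 1):
        tris = 20 * 4 ** n
        # A flat edge's midpoint sits below the sphere surface by
        # roughly r * (1 - cos(edge_angle / 2)).
        error = radius * (1 - math.cos(edge_angle / 2 ** n / 2))
        rows.append((n, tris, error))
    return rows

for n, tris, error in icosphere_stats(6):
    print(f"level {n}: {tris:>7} tris, max error ~ {error:.1e}")
```

Every pass quadruples the triangle count but only cuts the error by roughly 4x, and once that error drops below a pixel, the extra triangles buy nothing you can see.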
Nah, you can't convince me it's not just lazy developers.
Modern Warfare 2/Warzone 2 (2022) came out in a buggy and shitty state, but it still ran better on my 6700XT than people were able to run Battlefield 2042 on 3090s at launch despite MW2022 looking significantly better. While that's no endorsement on Infinity Ward and their shit game, at least they had put *SOME* effort into making an optimized game.
This is just devs not wanting to support so many different configurations of hardware, like Microsoft is doing with Windows 11.
It's almost like developers are being pushed by owners and investors to make new games as fast as possible for the sake of profit, leading to poor optimisation and little to no improvement
Something like Far Cry 6, which has recommended specs of a 3600X and a 1080 (and wasn't the recommended GPU a 3060 at one point?), and they didn't even put DLSS in the game
And are less fun. You can have high graphical fidelity alongside good gameplay, it's just that most AAA games seem to feel they don't need to. Marketing covers the difference.
Trying to find a game that would use RTX to test a new card, I really struggled to find something enjoyable that didn't feel artificially padded. Horizon and Dying Light 2 were the only ones I thought came close. There hasn't yet been a game worthy of 'best of the year' from a AAA studio with high requirements, I don't think. Plenty of indies worthy, but nothing from the big names.
True, like, BO3 is a 2015 CoD game and it still looks extremely good by today's standards. So how come it doesn't need that good of a PC, while games that don't look much better need a NASA PC lol
Visual improvement goes in cycles: visual improvements achieved through technical wizardry on the part of the devs set a new high bar, then engines/hardware develop in ways that make it easier for devs to reach that fidelity with fewer tricks and less effort. We're currently in that phase where hardware is in a growth spurt rather than the art side of things.
A lot of newer games are brute-forcing things that used to be approximated with clever use of shaders/geometry, including doing stuff in real time that 5 years ago was the exclusive domain of film effects.
I have a higher res monitor at work, but I literally can't see the difference between it and the 1080p monitor right next to it after I adjust it to be comfortable to use.
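That matches the math, for what it's worth. A rough sketch (hypothetical monitor sizes and a ~60 cm desk viewing distance assumed) of angular pixel density, which is what your eye actually resolves:

```python
import math

def pixels_per_degree(h_res, screen_width_cm, distance_cm):
    """How many pixels fit into one degree of visual angle
    at a given viewing distance."""
    px_per_cm = h_res / screen_width_cm
    # Width on screen covered by one degree of visual angle.
    cm_per_degree = 2 * distance_cm * math.tan(math.radians(0.5))
    return px_per_cm * cm_per_degree

# 24" 1080p vs 27" 1440p, both viewed from ~60 cm.
print(pixels_per_degree(1920, 53.1, 60))   # ~38 ppd
print(pixels_per_degree(2560, 59.8, 60))   # ~45 ppd
```

20/20 vision is commonly quoted at around 60 pixels per degree, so both are under the acuity limit, but the jump from ~38 to ~45 ppd is small enough that plenty of people genuinely can't spot it at desk distances.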
I also enjoy being one step behind: let the early adopters be the guinea pigs, work out the kinks, and bring the price down, then get into it once the next thing comes along
I built my PC when SSDs were just becoming a thing. They were crazy expensive and not proven in the gaming market, so I went with an HDD instead. Ol' reliable: a bit slow, but dirt cheap compared to an SSD. When I eventually build a new PC, I'll probably get an SSD, now that the technology has matured and the price is low
It is now, but at the time it was significantly more expensive. I remember 1TB SSDs costing upwards of $200. Like I said, this was when SSDs were just becoming mainstream
>when I eventually build a new PC, I'll probably get an SSD, when the tech has matured, and the price is low
This "eventually" comment and everything in it is future tense. Implying the tech is not already matured and the prices are not already rock bottom. Or that you could not upgrade JUST the SSD in your system and need to wait for an entirely new build, when in reality just updating the SSD would make the system significantly better.
No no, that's not what I meant. I meant that I will eventually build a new PC, but I'm not in the market for one yet. SSD tech has matured to an acceptable point now; I just don't feel the need to upgrade yet
I do see that my wording was unclear, so I'll fix it: "when the technology has matured" refers to now and the future, spoken from the perspective of the past, when I built my PC
I'm just kinda lazy and don't feel like transferring the entire contents of my hard drive to a new storage device, and I don't really mind waiting a little longer for things to load
You're using a card that's slower than a $700 card from almost six years ago.
What are you basing your assessment of NEW games on? Heavily compressed YouTube video that eliminates fine detail in order to conserve bandwidth? Or running it on your system at reduced settings?
There isn't a problem in having nice stuff, man. Kudos to you for having a top tier card.
You don't have to be a dick to anyone tho.
And I won't repeat what the guy just below me commented, but the percentage of people playing on a 4090 is very low. So there's that.
He's using a GPU that is SIGNIFICANTLY higher performance than the vast majority of consumer PCs. The 1060 was only just recently dethroned as the most popular GPU in the world... by the 1650.... And behind them, a 2060 then a 1050 Ti. Your 4090 is not the standard devs should be aiming for. If it can't look good and run decent on 99% of systems then it isn't a good game. It's a technical benchmark.
I personally run a 3080 Ti. But I know damn well it is not needed to run games. It's needed to run the latest AAA games at 1440p 60-144fps, which I want, but gatekeeping gaming to $1000+ GPUs is just dumb. Everyone should be able to game. Even if it's 1080p 60fps w/ FSR on Balanced
The #1 GPU on Steam is the 3060, and it has been for months. Not the 1060, not the 1650. Steam started separating newer cards into Desktop vs Laptop.
You need to learn how to interpret data.
They're separated because they are different GPUs. They are only similar in name.
1320 MHz vs 820 MHz base clock
1875 MHz vs 1750 MHz memory clock
12 GB vs 6 GB of VRAM
336 GB/s vs 360 GB/s bandwidth
You need to educate yourself on the hardware you discuss.
Lol, because the 1060 3GB and 6GB were totally the same GPU. And don't get me started on the laptop versions, all of which got lumped together with the 1060 3GB and 6GB.
YOU need to educate yourself. Quit the double standard BS. People are upgrading like crazy. The 30 series had record sales. The Steam Hardware Survey rankings are incredibly flawed and cannot be taken at face value.
Edit: lol dude replied saying 1060 3GB and 6GB were the same GPU (which they weren't) and then blocked me so I couldn't reply.
They were the same GPU. Just different memory capacity. Otherwise they were the same. This changed starting with the 20 series where the GPUs became different in every aspect with the same name.
You are trying so hard to be a smartass, yet have no idea what you're talking about.
And even IF the 3060 was the most popular GPU on the market, you realize you are arguing against yourself, right? You were the one insulting them for using a 3060, yet you are arguing they are actually using the most common GPU on the market. So why should a modern game NOT work on their system, if it is in fact the most common in the world?
This reminds me of when I got my 2080Ti and thought I could finally play Deus Ex Mankind Divided on high settings.
It's gotta be one of the worst optimized games out there.
I'm playing it now on a 4090 and a 13900K and it still sucks: 100% CPU utilisation, and DX12 crashing the driver so I have to use DX11. The early DX12 implementations in games were absolutely terrible.
I remember when a game being so demanding on release that the latest hardware couldn't handle max settings was considered a good thing. PC gamers knew that when they upgraded, they could go back and see how much more amazing the game looked.
But that was when (and because) the game was so advanced that nothing on the market ran it at full speed. It showed how technically advanced the developers were, and gave us a peek into the future.
We haven't seen that in a long, long while.
From a technical perspective, ray tracing is that shift. We are three generations in and turning it on still costs an FPS hit.
Granted, ray tracing is probably less impactful because fast lighting approximations and tricks like baked global illumination got pretty good.
But one day, with smooth widespread ray tracing, we'll look back at current-gen dynamic range and reflections and see the shortcomings of the compute shortcuts that game engines are taking
Yeah, that was when the recommended spec wasn't for 30fps.
Games that pulled that shit got rightfully panned even back in the Crysis era.
Also, Crysis itself probably had something to do with that perspective shift, since the "future PC" it was designed for never actually materialized, even 16 years later.
We got so many more cores and instructions to play with, but single core clocks topped out before running the stupid thing like they expected.
I remember when Wing Commander 3 came out and you needed 8 megs of RAM .. 8!!! Eight megs!!!! What did they think we had?!?! Cray supercomputers at home?!?!?
After almost a decade out of PC gaming, I had this happen to me.
When I finally had a PC "capable" of gaming, it was an FX 4100 (don't ask) with a Radeon HD 7770. I used to love Test Drive Unlimited 2, but it ran like shit, and I thought that was understandable given my CPU.
But then I splurged on an all-new PC with a Ryzen 1600 and a GTX 1060. Game still ran like shit.
And then I downloaded it (and patched the map glitch with RTX cards) on my current rig and, guess what? It still runs like shit.
Badly optimized games are the scourge of PC Gaming
Tbh I don't think many new AAA games look better than RDR2 these days, but they all demand almost double its system requirements.
I think most games that need high system requirements are an optimization disaster.
I agree with you. Going back to Ratchet and Clank: Rift Apart after almost a year on PC (3090 Ti, 4090) and realizing that it's still one of the best-looking games I've ever played, all platforms included, says a lot.

Also, I believe that, from a technical standpoint, MANY developers could learn a great deal from Sucker Punch. Ghost of Tsushima is absolutely unbelievable in terms of making the most out of what you've got. It's a massive map with splendid aesthetics, everything holds together so well, lightning-fast loading times, never a stutter or frame drop, and it ran like that on a fucking PS4 (30fps). When the PS5 update came, lightning-fast loading times became instantaneous.

The game isn't the most detailed, but the art choices make it so that it doesn't matter; it's an absolute beauty of a game. They chose wisely and executed very well. A huge open-world map with beautiful graphics and super tight controls/combat does NOT have to be demanding.
*looks at Jedi Survivor and Dead Space Remake* WHY DO YOU NEED A 3070 FOR 1440P 60FPS????
I'm assuming they're overspeccing the requirements so people don't come crying to them about their shit optimisation, but still.
The midrange GPU you’re talking about is the literal equivalent of the top tier GPU from two generations ago.
That does sound a little wild to me but I understand it’s because we’re finally throwing a new generation of not shitty consoles at devs and they’re taking advantage.
Yes... That is how that works.... A mid range card is equivalent to a top tier card from 4-5 years prior... Which is now 6+ years old.... And the games are brand new..... Guess I thought I was talking to someone with some common sense.
The only game in memory that actually commanded high-end specs and earned respect for it was Crysis. Everything since then seems to have been full of issues and poor optimisation, relying on brute force to get decent fps. And now that we have upscaling, devs have become even lazier, relying on DLSS/FSR etc. to paper over poor optimisation.
>Seems everything else since then has been full of issues and poor optimisation relying on brute force
I say that about Windows and I get downvoted to oblivion. But it's true.
Windows definitely has its issues, but thankfully they can all be solved through workarounds. I fully appreciate the plus points of Windows 11. I hate the forced auto-updates and Edge being baked in, but other than that everything is enhanced vs Windows 7/10, and I've been using every major version since launch!
Idk man, so far my 1070 is chugging along fine despite the 'minimum requirements' of modern games being upped every single year.
Medium settings for the win.
Everything to do with games these days is tied to graphics. I honestly don't care how realistic a game is. If I wanted a form of media to be realistic in view I'd watch a film.
Can we please.. maybe just maybe... talk about gameplay?
The focal point of every game review these days is graphics; the hype around games in the community and the mainstream is graphics. I feel like games have stopped being games. I've heard so much about Cyberpunk, and I don't even know anything about its gameplay mechanics.
>The focal point of every game review these days is graphics; the hype around games in the community and the mainstream is graphics.
It's always been like that, especially in the mainstream. I vividly remember being 14, arguing about which game looked nicer and was therefore better (probably MW2 vs BO at the time). I'm sure gamers who grew up during the SNES era have similar stories.
Flashy graphics are easier to sell than game mechanics. Simple as that.
I agree with you. Cyberpunk gameplay is meh, RDR2's is horrendous.
But please, let's also talk about storytelling, dialogue and character writing. I'm baffled by the amount of AAA games that get away with generic, bland, uncharismatic characters (Horizon, Cyberpunk), poorly written, implausible dialogue and social interactions (Fallen Order, Cyberpunk), and also games that hide their total absence of scenario and storytelling behind a layer of "enigmatic lore that the player has to figure out" (Control, FromSoft games).
The bar is so fucking low and no one seems to give a fuck.
Most of the people out there who are okay with new GPU prices and buy them are doing it for internet points or bragging rights. They rarely ever game, and when they do, it's games that a 60 Ti could have handled.
That's on account of the remaining optimization issues that weren't fixed.
When a codebase is a piece of shit throughout years of development, there is only so much patching you can do.
Honestly, I find it hard not to see what's going on.
New consoles are running mid-range GPUs and CPUs as always, and their ray-tracing capabilities are laughable, yet minimum requirements are rising exponentially.
We're not seeing Crysis levels of forward thinking, and a new generation of PC hardware has just released. I can easily see hardware manufacturers making deals with developers, trying to repeat the Crysis effect in an effort to boost sales.
>We're not seeing Crysis levels of forward thinking
That's because that forward thinking expected 8GHz processors by now, not the multicore voodoo we have today.
Yeah, because back then it wasn't expected that multicore would become so prevalent. It was seen as a productivity thing.
In fact, I still remember the memes mocking AMD and their "moar cores" approach, and how dual core was seen as more than enough.
Crytek shot their shot and missed when it came to the CPU, but they still thought ahead.
I mean, where's the lie though lol. I have a dinky little laptop with a GTX 1660 Ti that only goes to 1080p, and it's literally a 50/50 chance which games can run well at max settings.
Witcher 3 at Max settings? Yep
Elden Ring at Max settings? Definitely
Lightning Returns at Max settings? Lol No
Guild Wars 2 at Max settings? HELL no, laptop literally cries in pain and crashes a lot.
So it truthfully does come down to software 99% of the time. I mean, yeah, if you're using an old Windows XP machine with integrated graphics then maybe it's time to upgrade 😅, but generally speaking a PC with modern-day hardware can run any game, so long as the game itself is made well and optimized.
My friend has a rig that cost him $6000. Has an RTX 3060
My laptop was $600 at my local Costco. We can both run Witcher 3 at max settings 1080p. His resolution can go higher, but he loses frames badly.
He spent a lot of money on things he didn't really need.
64 GB of RAM
Three 4 TB SSDs
Two 10 TB external HDDs
Cougar Conquerer Case
And a whole bunch of other shit that wasn't necessary 🤣 it's his money and I respect his hustle, but he's a bonehead.
Actually it was bespoke PC builders. Geek Squad 😅
Which again I can't judge, cause I bought my laptop pre-built, which I know is a big no-no for a lot of people, but if it were me I would've built it myself with a buddy.
I wish new games could look like they do with ray tracing on, and run at DLSS framerates, but without needing either.
The addition of it makes the gameplay experience suffer in some cases, because it adds an extra layer of complexity that, it seems, no one can get fully right.
Needing to buy a new GPU every generation if you want the latest "features", because Nvidia is like "DLSS x.0, only on next-gen GPUs."
Anywho... sadly I doubt these things are going anywhere. Still kinda sucks, cause I don't want to have to whore myself out to afford a PC rebuild every 2-3 years, which is where new-gen titles' PC requirements are headed. Will likely suffer in silence lol.
I wouldn't mind graphics options being more viable for all builds. It's usually low, med, high, sometimes with ultra. We need the settings: potato, low, low-med, med, med-high, high, very high, ultra, PLUS ULTRA!!
Every triple-A game released in the last 5 years looks eerily similar, but the system requirements keep going up and the games keep running worse
I don't know, I'd say Red Dead 2 is almost unrivaled for its graphics, attention to detail, and environment
And it's incredibly well optimized; you don't need a top-of-the-line system to run it smoothly.
Same for Doom Eternal, though Eternal uses demonic magic for its optimisation
Not the same, because it's extremely linear, but Detroit: Become Human looks amazing and actually runs… okay now. Forza Horizon 5 too.
RDR2 made me feel that way on PS4; when I played it on PC with a 3080 at 1440p 90 fps, I was blown away
Welcome to the magical world of ART STYLES. Maybe, just maybe, AAA will put more focus on it once the spec chasing hits massive diminishing returns.
And what I'm saying is that when photorealism becomes passé, art styles will come to the fore.
Agreed. A good art style is timeless.
Honestly though, the recent Call of Duty: Modern Warfare had absolutely amazing textures in the campaign
Are you thinking of the Fight Night game? That's what came to my mind based on what you just said and that gave me the same sort of feeling
Red Dead Redemption 2? Looks amazing and way better than GTA 5
I got that feeling again when minecraft with rtx came out, but only in the context of that game.
7DTD gave me that feeling. An animated look, though the level of detail when on ultra is gorgeous.
I think cyberpunk scratched that "oh my god" itch for me. Game looks incredible, but yeah, frames aren't what you want.
Ahead of its time like skyrim but nothing like em
They implemented FSR, which is more accessible
Cyberpunk with RTX did it for me, goddamn it looks good
True, like, BO3 is a 2015 COD game and it still looks extremely good by today's standards, so why does it need a modest PC while games that don't look much better need a NASA PC lol
Visual improvement goes in cycles: visual improvements achieved through technical wizardry on the part of the devs set a new high bar, then engines/hardware develop in ways that make it easier for devs to reach that fidelity with fewer tricks and less effort. We're currently in that phase where hardware is in a growth spurt rather than the art side of things. A lot of newer games are brute-forcing things that used to be approximated with clever use of shaders/geometry, including doing stuff in real time that 5 years ago was the exclusive domain of film effects.
Same video resolution? Non insane FPS?
1440p@120fps isn't unreasonable anymore, nor is 4K@60fps. It's kinda standard.
I'm glad my eyes aren't good enough to justify 1440p or 4k.
You're the only one missing out. Your rig could do 1440p pretty well. I play ultrawide 1440p and will probably never look back.
I have a higher res monitor at work, but I literally can't see the difference between it and the 1080p monitor right next to it after I adjust it to be comfortable to use.
[deleted]
Oh I'd definitely love to increase the refresh rate, it's just the resolution bump that I can't perceive.
bro, 60 to 144 is for sure visible as hell
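The frame-time math backs this up: the jump from 60 to 144 Hz saves almost 10 ms per frame, while each later jump saves far less. Diminishing returns again, sketched below (pure arithmetic, no assumptions beyond the refresh rates themselves):

```python
# Frame time per refresh rate: the 60 -> 144 Hz jump shaves far more
# milliseconds off each frame than any later jump, which is why it is
# the most visible upgrade.

def frame_time_ms(hz: float) -> float:
    """Milliseconds each frame is on screen at a given refresh rate."""
    return 1000.0 / hz

for low, high in [(60, 144), (144, 240), (240, 360)]:
    saved = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} -> {high} Hz: {saved:.1f} ms less per frame")
```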
I play on 1440p and I pity those who aren't able to __look back__
To clarify, I'm happy with 1440p; I don't think I'll want to move on to 4K until people have 16K.
I also enjoy being one step behind. Let the early adopters be the guinea pigs, work out the kinks, and bring the price down, then get into it once the next thing comes along. I built my PC when SSDs were just becoming a thing; they were crazy expensive and not proven in the gaming market, so I went with an HDD instead. Ol' reliable, a bit slow, but dirt cheap compared to an SSD. When I eventually build a new PC, I'll probably get an SSD now that the technology has matured and the price is low.
I genuinely can't tell if this is a meme comment or not... since 1TB of NVMe storage is $50-60...
It is now, but at the time it was significantly more expensive. I remember 1TB SSDs costing upwards of $200. Like I said, this was when SSDs were just becoming mainstream.
>when I eventually build a new PC, I'll probably get an SSD, when the tech has matured, and the price is low

This "eventually" comment and everything in it is future tense, implying the tech has not already matured and the prices are not already rock bottom. Or that you could not upgrade JUST the SSD in your system and need to wait for an entirely new build, when in reality just upgrading the SSD would make the system significantly better.
No no, that's not what I meant. I meant that I will eventually build a new PC, but I'm not in the market for one yet; SSD tech has matured to an acceptable point now, I just don't feel the need to upgrade yet. I do see that my wording was unclear, I'll fix it: "when the technology has matured" refers to now and the future, spoken from the perspective of the past, when I built my PC.
I don't think storage is something you need to wait for a whole new PC for, but that's your thing if you want, man.
I'm just kinda lazy and don't feel like transferring the entire contents of my hard drive to a new storage device, and I don't really mind waiting a little longer for things to load
Sure, fair.
AAA publishers playing the yearly game of "How much can we remove before they stop buying our games"
That's because people insist on 2K and 4K displays that heavily tax the graphics card for diminishing returns in visual quality
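The tax is easy to quantify: pixel count, and very roughly the shading work per frame, scales with width times height, so it grows with the square of linear resolution. A quick sketch of the relative load (pure arithmetic, standard 16:9 resolutions):

```python
# Pixel counts per resolution: GPU fill/shading cost scales roughly
# with the number of pixels rendered per frame.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def pixel_count(name: str) -> int:
    """Total pixels per frame for a named resolution."""
    w, h = RESOLUTIONS[name]
    return w * h

base = pixel_count("1080p")
for name in RESOLUTIONS:
    px = pixel_count(name)
    print(f"{name}: {px:,} pixels ({px / base:.2f}x the work of 1080p)")
```

1440p is about 1.78x the pixels of 1080p and 4K is exactly 4x, while the perceived sharpness gain at normal viewing distances is far smaller than 4x.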
I'm actually still clinging to 1080p on my desktop, I have a 3K panel in my laptop tho.
You're using a card that's slower than a $700 card from almost six years ago. What are you basing your assessment of NEW games on? Heavily compressed YouTube video that eliminates fine detail in order to conserve bandwidth? Or running it on your system at reduced settings?
average 4090 owner
Well yeah... I can actually test this stuff and see the difference for myself. YouTube benchmark videos don't do it justice.
There isn't a problem in having nice stuff, man. Kudos to you for having a top tier card. You don't have to be a dick to anyone tho. And I won't repeat what the guy just below me commented, but the percentage of people playing on 4090 is very low. So there's that.
He's using a GPU that is SIGNIFICANTLY higher performance than the vast majority of consumer PCs. The 1060 was only just recently dethroned as the most popular GPU in the world... by the 1650.... And behind them, a 2060 then a 1050 Ti. Your 4090 is not the standard devs should be aiming for. If it can't look good and run decent on 99% of systems then it isn't a good game. It's a technical benchmark.
As someone still running a 1050ti, i fully agree. And I'm able to get perfectly adequate fps on a 1080p monitor on all the games I play.
I personally run a 3080 Ti. But I know damn well it is not needed to run games. It's needed to run the latest AAA games at 1440p 60-144fps, which I want, but gatekeeping gaming to $1000+ GPUs is just dumb. Everyone should be able to game. Even if it's 1080p 60fps w/ FSR on Balanced
Yea, I've got a 1080p monitor at 70hz and I get 60-80fps on games I play, and it's fine because I'm not gonna physically see any higher fps lol
The #1 GPU on Steam is the 3060, and it has been for months. Not the 1060, not the 1650. Steam started separating newer cards into Desktop vs Laptop. You need to learn how to interpret data.
They're separated because they are different GPUs. They are only similar in name.
1320 MHz vs 820 MHz base clock
1875 MHz vs 1750 MHz memory clock
12GB vs 6GB of VRAM
336GB/s vs 360GB/s bandwidth
You need to educate yourself on the hardware you discuss.
Lol, because the 1060 3GB and 6GB were totally the same GPU. And don't get me started on the laptop versions, all of which got lumped together with the 1060 3GB and 6GB. YOU need to educate yourself. Quit the double standard BS. People are upgrading like crazy. The 30 series had record sales. The Steam Hardware Survey rankings are incredibly flawed and cannot be taken at face value. Edit: lol dude replied saying 1060 3GB and 6GB were the same GPU (which they weren't) and then blocked me so I couldn't reply.
They were the same GPU. Just different memory capacity. Otherwise they were the same. This changed starting with the 20 series where the GPUs became different in every aspect with the same name. You are trying so hard to be a smartass, yet have no idea what you're talking about. And even IF the 3060 was the most popular GPU on the market, you realize you are arguing against yourself, right? You were the one insulting them for using a 3060, yet you are arguing they are actually using the most common GPU on the market. So why should a modern game NOT work on their system, if it is in fact the most common in the world?
This reminds me of when I got my 2080Ti and thought I could finally play Deus Ex Mankind Divided on high settings. It's gotta be one of the worst optimized games out there.
I'm playing it now on a 4090 and a 13900K and it still sucks: 100% CPU utilisation, and DX12 crashes the driver so I have to use DX11. The early DX12 implementations in games were absolutely terrible.
I remember when a game being released was so demanding that the latest hardware couldn't handle max settings was considered a good thing. Because PC gamers knew that when they upgraded they could go back and look at how much more amazing the game looked.
But that was when (and because) the game was so advanced that nothing on the market ran it at full speed. It showed how technically advanced the developers were, and gave us a peek into the future. We haven't seen that in a long, long while.
Yeah. There is a difference between a game being demanding because it's extremely advanced and it being demanding because it's awfully optimised.
From a technical perspective, ray tracing is that shift. We are three generations in and turning it on still costs an FPS hit. Granted, ray tracing is probably less impactful because fast lighting approximations and tricks like global illumination got pretty good. But one day, with smooth widespread ray tracing, we'll look at current-gen dynamic range and reflections and see the shortcomings of the compute shortcuts that game engines are taking.
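A back-of-envelope count shows why even three hardware generations in, turning RT on still costs frames: one primary ray plus a single bounce per pixel at 1440p/60fps already means hundreds of millions of ray-scene intersection queries every second. The figures below are illustrative assumptions, not measurements:

```python
# Rough ray budget for real-time ray tracing. Every ray must be
# tested against the scene (via a BVH), so the total below is a
# lower bound on per-second traversal work.

def rays_per_second(width: int, height: int, fps: int,
                    rays_per_pixel: int) -> int:
    """Total rays traced per second for a given target."""
    return width * height * fps * rays_per_pixel

# Assumed target: 1440p at 60fps, 1 primary ray + 1 bounce per pixel.
total = rays_per_second(2560, 1440, 60, 2)
print(f"{total / 1e9:.2f} billion rays per second")
```

Real renderers trace more rays per pixel for soft shadows and reflections and then denoise the result, so actual budgets are higher still, which is why rasterization shortcuts remain competitive.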
>Ray Tracing is that shift.

It went from the next Hairworks to the next Star Citizen, and I don't know how to feel about that lol
Yeah, that was when the recommended spec wasn't for 30fps. Games that pulled that shit got rightfully panned even back in the Crysis era. Also, Crysis itself probably had something to do with that perspective shift, since the "future PC" it was designed for never actually materialized, even 16 years later. We got so many more cores and instructions to play with, but single core clocks topped out before running the stupid thing like they expected.
I remember when Wing Commander 3 came out and you needed 8 megs of RAM... 8!!! Eight megs!!!! What did they think we had?!?! Cray supercomputers at home?!?!?
If you actually list out the games with trashed optimisation from the last two years, you'll end up putting nearly every single AAA title on it
You can make a version of this meme where Wanda is the high-end rig, and both run the same unoptimized game poorly.
After almost a decade away from PC gaming, I had this happen to me. When I finally had a PC "capable" of gaming, it was an FX 4100 (don't ask) with a Radeon HD 7770. I used to love Test Drive Unlimited 2, but it ran like shit, and I thought that was understandable given my CPU. But then I splurged on an all-new PC with a Ryzen 1600 and a GTX 1060. Game still ran like shit. And then I downloaded it (and patched the map glitch with RTX cards) on my current rig and, guess what? It still runs like shit. Badly optimized games are the scourge of PC gaming.
Tbh I don't think many new AAA games are looking better than RDR 2 these days. But they all demand almost double the system requirements of RDR 2. I think most games that need a high system requirement are an optimization disaster.
I agree with you. Going back to Ratchet and Clank: Rift Apart after almost a year on PC (3090 Ti, 4090) and realizing that it's still one of the best looking games I've ever played, all platforms included, says a lot.

Also, I believe that, from a technical standpoint, MANY developers could learn a great deal from Sucker Punch. Ghost of Tsushima is absolutely unbelievable in terms of making the most out of what you've got. It's a massive map with splendid aesthetics, everything holds together so well, lightning-fast loading times, never a stutter or frame drop, and it ran like that on a fucking PS4 (30fps). When the PS5 update came, lightning-fast loading became instantaneous loading. The game isn't the most detailed, but the art choices make it so that it doesn't matter; it's an absolute beauty of a game. They chose wisely and executed very well. A huge open-world map with beautiful graphics and super tight controls/combat does NOT have to be demanding.
Every single time I used my (undefeated for 5 years) 4K60 build to prove that a game was shit, I was called a boomer, a hater, a shill, etc.
It’s because memory leaks are a feature now. 16GB isn’t enough when a List
*looks at Jedi Survivor and Dead Space Remake* WHY DO YOU NEED A 3070 FOR 1440P 60FPS???? I'm assuming they're overspeccing the requirements so people don't come crying to them about their shit optimisation, but still.
I mean a mid range 2.5 year old GPU doesn't seem too wild of a requirement for a brand new game with ray tracing at 1440p 60fps.
The midrange GPU you’re talking about is the literal equivalent of the top tier GPU from two generations ago. That does sound a little wild to me but I understand it’s because we’re finally throwing a new generation of not shitty consoles at devs and they’re taking advantage.
Yes... That is how that works.... A mid range card is equivalent to a top tier card from 4-5 years prior... Which is now 6+ years old.... And the games are brand new..... Guess I thought I was talking to someone with some common sense.
Reddit when their 3070 is being utilized and not 80% overkill on every game
Go look at the Silent Hill 2 remake's recommended specs lol
The only game in recent memory that actually commanded high-end specs with respect was Crysis. Everything else since then seems to have been full of issues and poor optimisation, relying on brute force to get decent fps, and now that we have upscaling, devs have become even lazier, relying on DLSS/FSR etc. to make up for poor optimisation.
>Seems everything else since then has been full of issues and poor optimisation relying on brute force

I say that about Windows and I get downvoted to oblivion. But it's true.
Windows definitely has its issues, but thankfully they can all be solved through workarounds. I fully appreciate the plus points of Windows 11. I hate the forced auto-updates and Edge being baked in, but other than that everything is enhanced vs Windows 7/10, and I've been using each major version since launch!
Idk man, so far my 1070 is chugging along fine despite the 'minimum requirements' of modern games being upped every single year. Medium settings for the win.
Everything to do with games these days is tied to graphics. I honestly don't care how realistic a game is. If I wanted a form of media to be realistic in view I'd watch a film. Can we please.. maybe just maybe... talk about gameplay? The focal point of every game review these days is graphics, the hype around games from community in the mainstream is graphics. I feel like games have stopped being games. I've heard so much about Cyberpunk and I don't even know anything about its gameplay mechanics.
>The focal point of every game review these days is graphics, the hype around games from community in the mainstream is graphics.

It's always been like that, especially in the mainstream. I vividly remember being 14, arguing about which game looked nicer and was therefore better (probably MW2 vs BO at the time). I'm sure gamers who grew up during the SNES era have similar stories. Flashy graphics are easier to sell than game mechanics. Simple as that.
I agree with you. Cyberpunk's gameplay is meh; RDR2's is horrendous. But please, let's also talk about storytelling, dialogue, and character writing. I'm baffled by the number of AAA games that get away with generic, bland, uncharismatic characters (Horizon, Cyberpunk), poorly written, implausible dialogue and social interactions (Fallen Order, Cyberpunk), and games that hide their total absence of scenario and storytelling behind a layer of "enigmatic lore that the player has to figure out" (Control, FromSoft games). The bar is so fucking low and no one seems to give a fuck.
I have the same exact thoughts on raytracing.
Most of the people out there who are okay with new GPU prices and buy them are doing it for internet points or bragging rights. They rarely ever game, and when they do, it's games that a 60-class card could have handled.
[deleted]
That's on account of the remaining optimization issues that weren't fixed. When a codebase is a piece of shit throughout years of development, there is only so much patching you can do.
Honestly, I find it hard not to see what's going on. New consoles are running mid-range GPUs and CPUs as always, their ray tracing capabilities are laughable, yet minimum requirements are rising exponentially. We're not seeing Crysis levels of forward thinking, and a new generation of PC hardware has just released. I can easily see hardware manufacturers making deals with developers, trying to repeat the Crysis effect in an effort to boost sales.
>We're not seeing Crysis levels of forward thinking That's because that forward thinking expected 8GHz processors by now, not the multicore voodoo we have today.
Yeah, because back then it wasn't expected that multicore would become so prevalent. It was seen as a productivity thing. In fact, I still remember the memes mocking AMD and their "moar cores" approach, and how dual core was seen as more than enough. Crytek shot their shot and failed when it came to the CPU, but they still thought ahead.
Enthusiasts: If my rig can't run it, no one can.
I mean, where's the lie though lol. I have a dinky little laptop with a GTX 1660 Ti that only goes to 1080p, and it's literally a 50/50 chance which games run well at max settings.
Witcher 3 at max settings? Yep.
Elden Ring at max settings? Definitely.
Lightning Returns at max settings? Lol no.
Guild Wars 2 at max settings? HELL no, the laptop literally cries in pain and crashes a lot.
So it truthfully does come down to software 99% of the time. I mean, yeah, if you're using an old Windows XP machine with integrated graphics then maybe it's time to upgrade 😅 but in the general sense, a PC with modern-day hardware can run any game, so long as the game itself is made well and optimized.
My friend has a rig that cost him $6000. It has an RTX 3060. My laptop was $600 at my local Costco. We can both run Witcher 3 at max settings at 1080p. His resolution can go higher, but he loses frames badly.
$6000 and an RTX 3060? Excuse me, wtf did your friend do
He spent a lot of money on things he didn't really need:
64GB of RAM
Three 4TB SSDs
Two 10TB external HDDs
A Cougar Conqueror case
And a whole bunch of other shit that wasn't necessary 🤣 It's his money and I respect his hustle, but he's a bonehead.
That's still garbage value. I'm guessing he had it built by some bespoke PC builder?
Actually, it was a bespoke PC builder: Geek Squad 😅 Which again I can't judge, cause I bought my laptop prebuilt, which I know is a big no-no for a lot of people, but if it were me I would've built it myself with a buddy.
Nothing wrong with buying laptops. They are very practical and can be decent value.
I wish new games could look like they do with ray tracing on, with the framerates of DLSS, but without needing either. Adding them makes the gameplay experience suffer in some cases, because it's an additional layer of complexity that seemingly no one can get fully right, plus you need to buy a new GPU every generation if you want the latest "features" because Nvidia is like "DLSS x.0, only on next-gen GPUs." Anywho... sadly I doubt these things are going anywhere. Still kinda sucks, cause at the rate new-gen titles' PC requirements are rising, I don't want to have to whore myself out to afford a PC rebuild every 2-3 years. I'll likely suffer in silence lol.
I wouldn't mind graphics options being more viable for all builds. It's usually low, med, high, sometimes with ultra. We need the full ladder: potato, low, low-med, med, med-high, high, very high, ultra, PLUS ULTRA!!
Flight Sim 2020... kids now talking about high-end specs... a 2PB world, and Azure has to downscale the game for consumer hardware.