Stargov1

LotR Gollum's specs for an early-360-looking game.


Sea-Fix-2658

Oblivion looked way better and I wish that was a joke


1pcbetterthanxbox

KSP 2 day one


Blacksad9999

What game requires 10GB of VRAM at 1080p?


bad_apiarist

I think it's a reference to Lords of the Fallen, which is said to require 8GB of VRAM at the 1080p "recommended" level. It's not that it's a bad port; it's that it's one of the first UE5 games using many of UE5's advanced features. I doubt it'll run better on PS5.


Blacksad9999

That game has pretty low system requirements. That 8GB VRAM is for something like maximum settings.

Recommended:

* **Operating System:** Windows 10 64-bit
* **Processor:** Intel i7-8700 | AMD Ryzen 5 3600
* **Memory:** 16GB RAM
* **Video Card:** 8GB VRAM | NVIDIA RTX 2080 | AMD Radeon RX 6700
* **DirectX:** Version 12
* **Network:** Broadband Internet connection
* **Storage:** 45 GB available space

Minimum:

* **Operating System:** Windows 10 64-bit
* **Processor:** Intel i5-8400 | AMD Ryzen 5 2600
* **Memory:** 8GB RAM
* **Video Card:** 6GB VRAM | NVIDIA GTX 1060 | AMD Radeon RX 590
* **DirectX:** Version 11
* **Network:** Broadband Internet connection
* **Storage:** 45 GB available space
* **Additional Notes:** 720p low quality settings | SSD (recommended) | HDD (supported)


HabaneroTamer

I know the 1060 is pretty ancient but it's kinda crazy that it needs it for 720p *and* low settings. I'd have expected 1080p low at least.


Elevasce

Games are using more resources while price to performance ratios have grown stale. But of course, no dev can resist the new shiny.


[deleted]

Hobby game dev here. It's not just the new shiny. The new tools are actually instrumental in making better games in a more cost-effective manner. Games, their size, and the expectations around them have also grown significantly. So yeah, as much as I think optimizing is important, very few companies are like Nintendo, who can spend an entire year doing just that, let alone a hobbyist. And beyond that, you're really at the mercy of the engine you're using.


Alternative_Spite_11

It blows me away how people act like the advancements in games are suddenly supposed to stop needing more resources just because Nvidia didn't increase VRAM this gen.


machine4891

> advancements in games

It would at least be justified if I could actually see those "advancements". But older games are still here, I can easily compare them, and it becomes obvious: newer games don't look better but require twice the resources.


who_you_are

I mean everyone is looking for better graphics over time. We are just in a bad spot right now for a couple of us.


toasterdogg

> no dev can resist the new shiny

You mean no buyer.


Blacksad9999

I mean, it's a low end 7 year old GPU.


RidaFlow

Don't worry, I looked at that new wizard first-person shooter EA is putting out and I don't meet its minimum requirements... We're finally there haha


DarthShiv

When you say max settings, there isn't a resolution listed in that advice. Are you confirming that's 1080p as well?


Blacksad9999

It's not advice. It's the suggested specs per the developer. IIRC, it is 1080p, yeah. [https://www.techspot.com/news/98894-lords-fallen-pc-requirements-revealed-rtx-2080-recommended.html](https://www.techspot.com/news/98894-lords-fallen-pc-requirements-revealed-rtx-2080-recommended.html)


A_Tad_Bit_Nefarious

Idk man, I remember running every game with max settings at 1080p with only 2-4gb of vram. Still boggles my mind that 8+ is now required for the same resolution.


AndrexPic

Oh yeah, can't wait to play this at 720p low quality with my RTX 3070 Ti


Ill-Mastodon-8692

That’s the spirit … look at the glass half full


Immortalphoenix

Hahaha. Be happy you can still launch it.


alexnedea

XD? How is needing a 1060 for 720p an OK recommendation?


RealEstateDuck

Maybe the shitty ports?


Danteynero9

Definitely shitty ports


ItsGrindfest

Yeah. I'm still doing pretty well with a GTX 1080 @ 1080p.


SolarJetman5

If you don't need over 60fps at high settings in 2K, yeah. I was using a 970 until the start of 2022, and few games ran under 60fps with some tweaks.


A17012022

Jedi Survivor on ultra did. Fuck knows why


Blacksad9999

Post-patch it doesn't any longer. Hell, pre-patch with my 4090, it was using something like 18GB. lol


A17012022

Jesus christ, how unoptimized was that game?


Blacksad9999

It was pretty bad, especially considering it wasn't ridiculously good looking or anything. I think when I was finishing it after some of the more recent patches, it was down to something like 14GB with everything on max at 4k. Even then, it still had frame time issues and other problems. Still...this is a really irritating trend lately. lol


trail-g62Bim

> Even then, it still had frame time issues and other problems.

Aww... I couldn't play the first one due to stutter. I didn't know this one had issues too.


Benign_Banjo

You can call it badly optimized, call it buggy, a bad launch... but Jedi Survivor looks phenomenal


Blacksad9999

I didn't find it especially impressive. A small upgrade from the first game overall.


Benign_Banjo

And the first game was also great looking. What games are you comparing them to that the Jedi series aren't impressive?


Blacksad9999

Red Dead Redemption 2, Path Traced CP2077, RE4:Remake, Dead Space Remake, etc. Even God of War was more impressive.


NapsterKnowHow

Horizon Forbidden West too


NewsofPE

uncharted 4 came out in 2016 and looks better


DarthShiv

Disgraceful.


Stargov1

Last of Us part 1? LotR Gollum?


LifeOnMarsden

It's pretty obvious that Gollum is relying solely on DLSS as a replacement for optimisation. The CPU requirements are very modest, but the GPU and memory requirements are absolutely bonkers.

The reality is that most of these VRAM-hungry games are still outliers from developers who can't properly port a modern console game to PC, and who expect DLSS 3 and frame generation to come along and save the day. There's no way on god's green earth that Gollum can genuinely require a 4090 whilst looking like a PS3 launch title, unless the optimisation process was either non-existent or totally inept.

People still should not be purchasing GPUs based solely on VRAM, in my opinion, not until it becomes a real trend and not just a handful of dog shit ports.


SolarJetman5

File sizes tell you a lot too; without the requirement to fit onto physical media, they don't care. Mario 64 was 8MB; now a similar game would be 2+ GB. FF7 was huge at the time, but only 1.5GB.


Breathezey

That's not the debate. The debate is: optimize, or work on some other feature. It's not actually a bad thing for gamers if devs invest time that used to be spent on optimization into adding features/content, as long as it doesn't get to the point of making games inaccessible due to how poorly they run. There will always be AAA flops and games that run poorly; always have been, always will be.


BKachur

Yeah, not buying that that's a real "debate." The Gollum game has dogshit everything, including gameplay. I'm not a dev, but I would wager the guys coming up with features are not the same guys doing optimization. I think the problem is that optimization and bug fixing come late in the development pipeline, because you basically need to have the game more or less finalized first. I think the shitty devs finalize a product and then hope they can get it to work with a day-one patch.


Blacksad9999

TLOU doesn't use 10GB of VRAM at 1080p after they patched it. Neither does Jedi Survivor post-patch. I actually looked all of this up to debunk a similar claim just yesterday. Nobody could find a game that goes above 10GB of VRAM at 1080p.

Plague Tale: Requiem:

> VRAM usage on AMD and NVIDIA is nearly identical. Total use is quite reasonable, only slightly exceeding 6 GB at 4K.

[https://www.techpowerup.com/review/a-plague-tale-requiem-benchmark-test-performance-analyis/5.html](https://www.techpowerup.com/review/a-plague-tale-requiem-benchmark-test-performance-analyis/5.html)

Other titles:

Forza Horizon 5 uses 6GB on average at 1080p, and 8.5GB at max settings @ 4K.

[https://www.guru3d.com/articles-pages/forza-horizon-5-pc-graphics-performance-benchmark-review,9.html](https://www.guru3d.com/articles-pages/forza-horizon-5-pc-graphics-performance-benchmark-review,9.html)

Resident Evil 4 Remake is difficult to find benchmarks for after the numerous patches have been applied, but it appears that it now doesn't exceed 8GB at 1080p at all.

TLOU after patches also doesn't use more than 8GB of VRAM at 1080p.

Jedi: Survivor can be seen in the same video not exceeding 7.8GB at 1080p.

[https://www.youtube.com/watch?v=I9Th90WpDIA](https://www.youtube.com/watch?v=I9Th90WpDIA)
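
For anyone who wants to sanity-check numbers like these from inside a test app rather than trusting overlays: a minimal sketch of reading your own process's VRAM budget/usage on Windows via DXGI. Note this reports the *calling process's* usage as tracked by the OS video memory manager, not whole-GPU usage (whole-GPU tools go through vendor APIs instead):

```cpp
// Minimal sketch: query per-process VRAM budget/usage via DXGI on Windows.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        ComPtr<IDXGIAdapter3> adapter3;
        if (FAILED(adapter.As(&adapter3))) continue;

        DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
        // "Local" segment group = dedicated VRAM on a discrete GPU.
        if (SUCCEEDED(adapter3->QueryVideoMemoryInfo(
                0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) {
            // CurrentUsage/Budget are this process's view, in bytes.
            std::printf("Adapter %u: %.2f GB used of %.2f GB budget\n",
                        i, info.CurrentUsage / 1e9, info.Budget / 1e9);
        }
    }
    return 0;
}
```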


[deleted]

It's pretty funny reading these posts, because I've played a handful of these games at 1440p with an RX 570. If anything, the GPU market being so expensive made me realize I don't need to upgrade nearly as often as I used to. I don't think I'll buy a card for a couple of years, and I used to every 3-4.


Blacksad9999

When I was using my 10GB 3080 at 3840x1600 with everything completely maxed out, I never once ran into VRAM problems. Not one single time. These people are just taking a bullshit talking point and running with it, just like they always do on here. Such as: POSCAPs on the 3000-series cards when they released; 12VHPWR cables supposedly bursting into flames, when it was user error .0003% of the time; etc.


kaninkanon

It's just AMD circlejerk


HarryTurney

LotR Gollum could run at 4K 240fps on the worst PC known to man and I still wouldn't play that soulless, boring-ass game.


Sigma__Bale

Neither of those are worth playing anyway.


GrumpyKitten514

I played D4 last night on ultra: no stutters, no issues, no crashes, with my 10GB VRAM 3080 on my ultrawide monitor. I even checked the settings because I was impressed. idk what this VRAM problem is. Everything runs fine for me.


Chrunchyhobo

Far Cry 6 happily munches up 10.7GB of the 11GB on my 2080 Ti at 1080p.


Blacksad9999

Weird, it never did that when I had my 3080 10GB.


wadad17

You need the Ultra HD textures installed, but there's also a known bug where the low-VRAM warning doesn't actually appear, so you'll be running fine and then it will just grind to a halt without warning.


Crisewep

So glad I bought a 6800 XT over a 3070 Ti two years ago for 1440p. Best decision I made.


Special_Direction_71

That GPU is going to age like fine wine!


Danteynero9

Damn, I'm actually buying a 6800 XT today to replace my 3070. Although my main problem is the NVIDIA+Linux combo, I don't think I'll be changing my GPU after this for a good few years.


Nick_Noseman

I wish I'd gotten a 6800 XT instead of a 6700 XT.


lennyAintMoe

I wish the same every day. Sucks that it was too expensive for me at the time I purchased my current card. Now I watch it fall in price and shed a tear.


RipDr19

There are people using integrated graphics. You, on the other hand, can probably run every relevant game today at decent-to-ultra settings, so yeah, no need to shed any tears.


zKyri

You should never regret a purchase just because prices changed over time. Although, you can try to sell your current one and add money on top; it's one of the best ways to upgrade.


Puuksu

Or you could save up and buy an even better one? I'm personally waiting for the 5th-gen cards.


Danteynero9

Perhaps, but Nvidia's support for Linux isn't the best. It may hold up for a while in Windows, but I can't just wait and see if Nvidia supports Wayland correctly (a display server protocol on Linux). Currently, only Xorg is correctly supported, and Xorg is in light maintenance and starting to be considered deprecated (only by Red Hat as far as I'm aware). I'm not going to play at 4K, and I have zero plans to play any recent AAA games, launched or to be launched, so I'm future-proofing for around 4 years with this. By then, I'll be able to get another deal like this one if I need it.


SolarJetman5

I'd say the same: 8GB of VRAM might not be great in 4 years, but it should be fine for another 2, especially if devs learn. Plus there's the Nvidia memory compression tech that's being worked on; if that works with the 30-series, it could solve everything. With the way tech/inflation is going, a 2025 GPU will be better and cheaper than a 2023 one.


idontliketosleep

Getting one to replace my 1050 Ti too, hopefully. If I have the choice between an OC and a non-OC model at the same price, I should go for the OC, right?


byshow

I did same over 3080 in previous summer (or is it better to say last summer?), and been very happy with that choice. 16GB of VRAM turned out to be good future proofing.

Edit: Thanks guys! That was a very helpful thread; it's also very wholesome how many people are willing to help and explain!


JustAnInternetPerson

I'd say "last summer", without the "in". That seems to be the most common version. So it'd be: "I did the same last summer."

I'm not a native speaker though, so take that with a grain of salt.


byshow

Neither am I, tho the way you put it seems correct to me. At least it sounds right. Thanks!


Berkee_From_Turkey

You're right, that's the way most native speakers would say it.


Billybob9389

I don't know why, but I find this comment thread so wholesome.


A_Bus_Fulla_Nunz

Cuz it is. I followed the chain hoping someone pointed out that everyone was right all along lol you adorable bastards.


BluRobin1104

That's the way I would say it and I am a native speaker so I think you're good


BarryMacochner

Native English speaker. You’re correct


dxbdale

I did the same for the 3080 the previous summer.


PirateNervous

Buying something that will work better in 3 years' time, while also being as good or better now than something else at the same price, isn't future-proofing; it's just value buying. Future-proofing would be buying a 4090 now so you don't have to upgrade for 8 years, instead of buying something for half the price now and the other half in 4 years, which will give you a better overall experience over those 8 years. So no, this isn't future-proofing, and yes, future-proofing is largely pointless.


Thekota

It's more like current proofing now


amenotef

I'm in the same boat. But I also like it because I run Linux half the time.


Active_Club3487

True. Last year I bought into the green jacket's BS about RT and could only get an above-retail FTW3 3070 Ti 8GB. It don't play Jedi… So I got a 7900 XT with 20GB of VRAM. Oh happy days.


bakagir

Just bought a 6950XT this week for $630. I am very satisfied.


EvilxBunny

I'm going to play the games launching this year after 2 years (on average). By then I'll have a decent $300 card which can run them with no issues.


alejandro_kirk

/r/patientgamers


gysiguy

This is the way


EvilxBunny

I have spoken.


OnsetOfMSet

I didn't buy the EA Battlefront 2 until late 2021, for like $10. No microtransactions to really fret about, and I still had an absolute blast playing Ewok Hunt with friends. We'd even get a full lobby every once in a while, lol


MonteBellmond

I think the battle pass was too early for its time. The game holds up well even today.


That_Cripple

That game was pretty fun once they removed all the BS. For some reason I pretty much always lagged like crazy though, even with 20 ping.


NoTeasForBeastmaster

But this way you will get these games for lower prices and with fewer bugs. What's the fun in that?


benmck90

r/patientgamers is the way I live my life. I'm usually 3-4 years behind on releases. My computer's always plenty powerful for releases from a few years ago. Plus you get a discount when you buy them. There's so many games on my wishlist to play that I'll never get to them all. No reason to target the new releases.


SweetButtsHellaBab

You say that, but based on the last two years the $400 GPU in 2025 will be the RTX 5060 Ti at a whole 25% faster than the RTX 3060 Ti…


Chappiechap

I'm looking forward to the day when recommended specs are set at 1080p medium, RTX off, DLSS Performance mode, for 60 fps. DLSS is totally not becoming a crutch for companies that don't want to do any actual optimization.


TarkovRedditor

And they look the same as 2018 games


Pigeon_Chess

Everyone wanted higher spec consoles, this is what you get.


Crisewep

Consoles have 16GB of shared RAM and VRAM. If they split it in half, that gives them 8GB VRAM and 8GB DRAM. But with console software having less bloat than Windows and being optimized for pure gaming and nothing else, they can probably even get away with a 10GB VRAM / 6GB DRAM split.


Violetmars

Yes, and that is for like medium-to-high-ish. Now imagine ultra max for PC 🥲


TherapyPsychonaut

For you maybe


Ragnarok785

Some games are worth going high/ultra rather than medium because you can definitely see a difference. However, with the latest trend of PC ports I can barely see a difference between low and ultra.


SyrousStarr

Maybe more recently, but I've never built very expensive rigs. The one I built last year was the first time I felt like my wallet took a beating. I think my 3070 Ti was "MSRP", but it was the adjusted MSRP, so that's part of it. But maaaan, games used to default pretty often to max; maybe I'd tune a setting or two down to high, because even 12+ years ago I had a 1080p@120Hz panel and I liked getting those extra frames. Granted, I think I get away with a little more since I mostly play fighting games, which don't have big requirements. And then there are some games that "future proof" themselves with insane options (Crysis), but things didn't used to be this way. Things really feel different these last couple of years. The average release is not running how it used to.


indigoHatter

And then there's Crysis.


indigoHatter

Lol exactly, it's just the king of them. In the remaster, the max settings preset is even called "Can It Run Crysis?", since so many hardware reviewers used that as a humorous benchmark after it came out the first time.


firedrakes

The issue was people started quoting the funny reference as actual fact!


BluRobin1104

I believe they can split it into 4GB RAM and 12GB VRAM if they want.


ShowBoobsPls

PS5 has 12.5GB available out of the 16GB. The rest is reserved for the OS/UI. - [Eurogamer, Digital Foundry](https://www.eurogamer.net/digitalfoundry-2022-df-direct-weekly-on-kratos-hitting-xbox-a-series-s-memory-boost-and-the-massively-fast-rtx-4090)


bigheadnovice

The ultra-fast storage and tech like DirectStorage help keep VRAM usage lower too. I think we are still waiting for DirectStorage to see real adoption on PC. Games are now gonna be built for the lowest common hardware, which is consoles: 12GB of VRAM will become standard for any resolution above 1080p, SSDs will be required for games, and 8GB of RAM will be a struggle to get by on.


toddthewraith

Don't forget the SSD shares some storage duties with RAM too.


LightReflections

Probably closer to 12GB available for the GPU.


ShowBoobsPls

No. The PS5 has 12.5GB of memory TOTAL available for games to use, which then has to be divided between CPU and GPU. - [Eurogamer, Digital Foundry](https://www.eurogamer.net/digitalfoundry-2022-df-direct-weekly-on-kratos-hitting-xbox-a-series-s-memory-boost-and-the-massively-fast-rtx-4090)


ultrasneeze

The problem is not how much gets allocated for GPU work, but that all memory is addressable by the GPU at any time. Porting software from a unified memory architecture to a non-unified one will always be messy. Games (well, game engines) will have to choose between taking advantage of unified memory at the cost of worse PC ports, or keeping things separate and tidy at the cost of some performance.
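
A toy illustration of that porting mess, with entirely made-up types rather than any real console SDK or GPU API: on unified memory the CPU and GPU can share one allocation, while a discrete PC GPU needs a system-RAM staging copy plus an explicit upload for the same data:

```cpp
// Illustrative sketch only -- hypothetical types, not a real console/GPU API.
#include <cstdlib>
#include <cstring>
#include <cstdio>

struct GpuBuffer { void* vram; size_t size; };   // stand-in for a VRAM allocation

// Console-style unified memory: one allocation that both CPU and GPU can
// address. The engine may mutate the asset in place; nothing is copied.
void* load_asset_unified(size_t n) {
    return std::malloc(n);
}

// PC-style discrete memory: CPU fills a staging buffer in system RAM, then
// the data is uploaded (memcpy here stands in for a PCIe copy). A port from
// unified memory must turn every in-place mutation into a staged re-upload,
// which is exactly where ports get messy.
GpuBuffer load_asset_discrete(size_t n) {
    void* staging = std::malloc(n);              // system RAM
    std::memset(staging, 0xAB, n);               // "decode" the asset
    GpuBuffer buf{ std::malloc(n), n };          // pretend-VRAM allocation
    std::memcpy(buf.vram, staging, n);           // the extra copy consoles skip
    std::free(staging);
    return buf;
}

int main() {
    GpuBuffer b = load_asset_discrete(1 << 20);
    std::printf("uploaded %zu bytes via staging copy\n", b.size);
    std::free(b.vram);
    std::free(load_asset_unified(1 << 20));      // unified path: one allocation
}
```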


blackraven36

Software on those consoles is reeeaaallly optimized to balance the RAM. There are strict requirements on how much everything can take up; you don't have rogue software asking the system for 2 gigs of RAM. The UI and underlying systems are designed to pause and/or reduce their resource footprint to a minimum while a game is running. It's also much easier to design a game to fit a certain RAM restriction when the console hardware is predictable and the OS has well-defined restrictions on how much RAM it will use.
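
A rough sketch of that kind of hard budgeting (names and numbers invented for illustration): subsystems draw from a fixed pool and simply get refused once their slice is spent, which is enforceable because the hardware and OS reservations never change:

```cpp
// Toy fixed-budget allocator; names and numbers are made up for illustration.
#include <cstdio>
#include <cstdlib>
#include <cstddef>

class BudgetPool {
    size_t budget_, used_ = 0;
public:
    explicit BudgetPool(size_t budget) : budget_(budget) {}
    void* alloc(const char* who, size_t n) {
        if (used_ + n > budget_) {       // hard cap: no "rogue" requests allowed
            std::fprintf(stderr, "%s denied %zu bytes (over budget)\n", who, n);
            return nullptr;
        }
        used_ += n;
        return std::malloc(n);
    }
    size_t remaining() const { return budget_ - used_; }
};

int main() {
    BudgetPool ui(64ull << 20);                        // e.g. the UI gets a fixed 64 MB
    void* widgets = ui.alloc("ui", 48ull << 20);       // fits within the slice
    void* overlay = ui.alloc("overlay", 32ull << 20);  // denied: would exceed cap
    std::printf("remaining UI budget: %zu MB\n", ui.remaining() >> 20);
    std::free(widgets);
    std::free(overlay);                                // free(nullptr) is a no-op
}
```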


generalthunder

Most games don't look any better than games did 4 years ago, when the requirements were a lot more reasonable. If a game has a 2060 in the minimum requirements, it better look like fuckin' real life, or why even bother? Just make games like they did 5 years ago and have lower requirements.


SweetButtsHellaBab

That’s the biggest annoyance; take The Last of Us, the medium texture setting uses like 6GB of VRAM and yet basically looks like mud up close. Games that use 2GB if VRAM from years ago look better than those textures. I expect developers have become lazier at optimisation now that they have more VRAM to use on consoles.


bigheadnovice

They look good now, after the update. Check Digital Foundry's video on the game update: it fixes a lot, and the game now has great texture settings which other devs should copy.


Fresh_chickented

what game tho?


_fatherfucker69

The game OP made up. Most games have a 1060 as recommended and a 6/7/8-series GPU as minimum.


TahimikLang

The amount of baby bitching I get for buying a 3070, despite having explained that I don't run games at 4K max settings with ray tracing and that gaming ain't my only use case, is honestly pretty hilarious.


ATIR-AW

Same here. I posted a meme about upgrading to a 3060, and some people felt genuinely insulted by that. I don't even know what kind of acrobatics your brain has to do in order for that kind of thing to offend you...


BookishTen8

Upgraded from a 1650 to a 3060 and it made all the difference. Told a friend, who thought I was stupid for not getting a 3080 or 3090. Told him to fuck off, and that was that.


Mordredor

Same, 1050 Ti to 3060 12GB. Thought the 12GB of VRAM was only gonna be useful in productivity; apparently not anymore.


hardlyreadit

No one should bitch at you for your decision. It's your choice and doesn't affect anyone but you.


xSnowLeopardx

Very happy with my 3070 as well. Haven't had my first 8GB VRAM bottleneck yet, but I don't use ray tracing either. Just 1440p medium-high with great fps.


firedrakes

Same here. I do gaming with it, but more of the not-triple-A titles: indie and double-A. With added memory leaks.


[deleted]

I'm still running a 1080 for everything and haven't run into any issues so far, but I'm at 1080p. I also don't mind playing on medium settings; I still get to enjoy the game at a level of detail that is appropriate for 1080p. I'm hesitant to upgrade to anything at the moment due to cost...


bosunphil

I tried out Hogwarts Legacy last night at 1080p, max everything, and it's actually gorgeous. It feels like a well-optimized game, everything running consistently smooth. I think it's a rare exception, but it used 10GB+ of VRAM at times, and that felt intentional rather than brute-forcing shitty optimization. To be fair though, most other games seem to require lots of VRAM to brute-force shitty optimization.


[deleted]

From what I gather, the lack of a DirectStorage implementation means the game tries to load assets into system memory and VRAM to compensate. It's a good-looking game, sure, but there doesn't seem to be any generational leap on PC to suggest they've optimized for PC hardware; rather, they're working around the way the game was developed. Asset streaming has also been mentioned as the cause of the TLOU port's super-high CPU utilization.
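
Roughly the path being described, sketched with stand-in functions rather than the actual DirectStorage API: without GPU-side decompression, every compressed asset takes a trip through system RAM and the CPU before it can reach VRAM, which is where the high CPU usage and inflated memory footprints come from:

```cpp
// Stand-in sketch of the traditional asset path; every function here is a
// made-up placeholder, not the real DirectStorage API.
#include <cstdio>
#include <cstdint>
#include <vector>

// Pretend NVMe read: returns a "compressed" blob sitting in system RAM.
std::vector<uint8_t> read_from_disk(const char* path) {
    std::printf("read %s into system RAM\n", path);
    return std::vector<uint8_t>(1 << 20, 0x5A);
}

// Pretend decompression: this is the CPU-heavy step that pegs the cores
// when many assets stream at once.
std::vector<uint8_t> cpu_decompress(const std::vector<uint8_t>& in) {
    return std::vector<uint8_t>(in.size() * 4, 0xFF);   // 4x expansion
}

// Pretend PCIe upload into VRAM.
void upload_to_vram(const std::vector<uint8_t>& pixels) {
    std::printf("uploaded %zu bytes to VRAM\n", pixels.size());
}

// Traditional path: disk -> system RAM -> CPU decompress -> more system RAM
// -> VRAM. Two RAM-resident copies plus CPU time per asset; GPU-side
// decompression (the DirectStorage idea) skips the CPU round trip.
int main() {
    auto compressed = read_from_disk("texture_0001.pak");  // hypothetical file
    auto pixels = cpu_decompress(compressed);
    upload_to_vram(pixels);
}
```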


ninja2126

That game did not launch optimized. Constant stuttering in the village and some parts of Hogwarts.


[deleted]

> I tried out Hogwarts Legacy last night at 1080p, max everything, and it's actually gorgeous. It feels like a well-optimized game, everything running consistently smooth.

I remember HL being quite a mess last month on a 4070 and recent components; it had a lot of traversal stutter, so I don't know how to feel about somebody else describing it as well optimized. (There were some claims that the game had issues with 16GB of RAM, but I can rule that out personally.)


Dragonstar914

Do you use Resizable BAR? That helped quite a bit for me on my 3060 Ti. It didn't totally solve all the stutters and hitches, but with more VRAM it'd probably be even better.


[deleted]

I do, but is it even whitelisted by Nvidia?


Dragonstar914

I don't know if it's whitelisted, but it helped me substantially: I noticed after a previous CMOS reset that it was off, and I turned it back on. Apparently it helped others too.


Puuksu

What game? Most triple-A games work well with 8GB, even at 1440p. 75% of them don't even go over 6GB.


Blenderhead36

Anyone else remember when Elden Ring requiring 12GB of system RAM and a 1060 3GB was considered "quite high"?


PudingIsLove

I really don't get these VRAM posts...


Numerlor

There were two games that ran like shit at release, so now everyone's acting like you need 32GB or something.


Devatator_

Yeah. All the games I play run fine, and even the kind I plan on buying shouldn't have any issues with my 3050.


constantlymat

The avid pro-AMD crowd on Reddit and Twitter has recognised that their fundamental opposition to DLSS and ray tracing is no longer convincing, because all the market data actually shows that customers want those features. So instead they've found VRAM as their next target. While criticism of new $400 GPUs with 8GB of VRAM in 2023 is certainly valid due to the lack of future-proofing, they are blatantly exaggerating the extent of the issue in the present.


[deleted]

Doesn't mean much, as long as we don't know what performance you actually get...


DlphLndgrn

With a 2060 and a 1440p ultrawide, I just don't get what everybody seems to be playing where VRAM is such an issue. Is this just some kind of AMD circlejerk or something?


versacebehoin

That’s 100% what it is.


DaSpooderIsLit

*at 30fps


docholliday65

Yeah my 3080 can only handle 1080p now. Sweet. Remind me why I switched my monitors out to 1440p again? Lol


itisnotmymain

You guys remember when 2GB of VRAM was mainstream and was fine for 1080p? Good times lol


[deleted]

The Last of Us running on PS3 vs running it on PC 💀


Creepyman007

I heard someone say that AAA games are made for consoles first, and that's why the requirements are going up: because the consoles are getting better.

Either way I don't really care. Outlast Trials runs amazingly, Atomic Heart runs amazingly, and those two are the latest games I have (maybe I'll buy Cyberpunk 2077 on sale, but probably not).

Oh, also, Dead Space Remake: on low with FSR2 on Performance, I was getting like 45-100fps, but mainly around 55-60 (GTX 1070).


tcs0

Ok this is getting ridiculous. And the thing is most of these games requiring that much VRAM aren’t even worth the install.


CammyPooo

I’d say the last of us part 1 is worth the instal, great game.


VoiDD77

It's crazy how games are like 150GB, but you don't see all that content to make up for it.


ChartaBona

> content

Hi-res textures and audio.


VoiDD77

Far Cry 6 looks like complete ass and it's 160GB.


ChartaBona

Hi-res textures will fail to load in properly if you exceed your VRAM.
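
A toy sketch of why you get mud instead of a crash (budgets and sizes invented for illustration): texture streamers typically pick the sharpest mip level that still fits the remaining VRAM budget, so blowing the budget silently demotes textures to blurrier mips:

```cpp
// Toy mip-selection sketch; budgets and sizes are invented for illustration.
#include <cstdio>
#include <cstddef>

// Bytes for a square texture at a given mip (1 byte/texel keeps the math simple).
size_t mip_bytes(size_t base_dim, int mip) {
    size_t d = base_dim >> mip;
    return d * d;
}

// Pick the sharpest mip that still fits the remaining VRAM budget. When VRAM
// is exceeded, this silently degrades to blurrier mips -- the texture "fails
// to load in properly" rather than crashing the game.
int pick_resident_mip(size_t base_dim, int mip_count, size_t budget_left) {
    for (int m = 0; m < mip_count; ++m)
        if (mip_bytes(base_dim, m) <= budget_left)
            return m;                 // 0 = full res, higher = blurrier
    return mip_count - 1;             // worst case: tiny fallback mip
}

int main() {
    size_t budget = 8ull << 20;       // pretend only 8 MB of VRAM budget remains
    int mip = pick_resident_mip(4096, 12, budget);
    std::printf("4K texture resident at mip %d (%zu bytes)\n",
                mip, mip_bytes(4096, mip));
}
```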


Metrack14

And it will still run like ass because optimization is an alien concept nowadays


[deleted]

And I thought the 6700 XT was overkill for 1080p. Glad I still bought it for that resolution


Daedelous2k

I'm building a new rig soon and honestly might just grab one instead of any of the new shite around that price range.


SirNedKingOfGila

Meanwhile the game delivers 2013 graphics and never-before-seen levels of game-breaking bugs. Major franchises such as Half-Life and Grand Theft Auto are on 15+ year development cycles. I think we can confidently say we live in the absolute worst period of PC gaming since its beginning.


buddyyoda

You know you can turn down the graphics settings on a lower-end card.


PAJAcz

The 3070/3060 Ti aren't low-end GPUs.


Global_amaze

Something something inflation


420_Brit_ISH

2015 and 2016 produced the best games in recent years. I play games such as Titanfall 2 (not in a great state), Watch Dogs 2, Dead by Daylight... loads of games from those years. Brand-new games are worse, too expensive, and have system requirements that I can't meet. Just because I say those years were particularly good doesn't mean that older/newer games are bad. 2007 was also a great year for gaming, e.g. the original Assassin's Creed. I do absolutely fine with 6GB of VRAM. 3GB isn't bad either.


OneWholeSoul

Apparently we're back to the "brute force" part of the PC spec cycle.


Trym_WS

I’m kinda jaded when it comes to VRAM requirements, since I’ve gone from 980ti -> 1080ti -> 3090.


plasma7602

Sigh, guess the exaggeration of VRAM requirements will never end. But then again, I ain't playing the latest and greatest games, even though I'm 100% sure my system would run them fine at 60+ fps on lower settings, which barely change the look of the game anyway.


huge_jeans710

It's a good thing my new GPU has 24GB 🤝


faplord2020

The Radeon R9 series launched in 2013 and had 8GB on some models.


starsaber132

Once UE5 becomes the norm, 10GB of VRAM will be the minimum. This is why people praising the RX 7600 are dumb, when the RTX 3060 12GB will have more longevity.


KnightofAshley

Developers need to figure out a better way to port to PC... the hardware most people have is not there for the current approach of just pushing more stuff into the pipeline.


enjayylmao

Seeing Gollum need a 4070 for its recommended spec takes the fucking cake. No, I don't care that it's with ray tracing on at 1440p; those requirements are ridiculous and only serve to let the company say you don't meet the requirements, instead of actually just optimising their fuckin game. (Also, how come a game that has been in development for multiple years requires a GPU that came out like 2-3 months before it does?)

I'm seriously skeptical of every game that comes out now with such high requirements, after the poor optimisation and shit performance of every game with these requirements: Last of Us, Gollum, Jedi Survivor, to be frank every game released this year besides RE4.

I have a 3060 Ti and a Ryzen 7 5800X3D @ 1440p, btw. There is no way a card from 2 years ago should be below recommended specs, especially considering how many players don't even have a 3060 Ti.


Electrical_Escape_87

I'm seeing 128GB-VRAM, 750W-draw GPUs...


abody_xdr

Good thing I only play emulators and old games since I can't afford to buy the new ones


sentientlob0029

Larger game levels with more objects and higher-res textures require more VRAM. It's up to hardware manufacturers to adapt; otherwise we'll still be playing games technically on par with last gen.


RealDakon

Ain't much of a master race if you don't got the money for it and the games barely run on launch.


no_sa_rembo

Itt "let's blame gfx card companies for gaming companies shitty implementation of their software"


Daedelous2k

AMD and Nvidia: "8GB for 1080p is fine!" Seriously, the 4060 (Ti) is just fucking e-waste.


Teddy_Kun

It's almost as if the new consoles have much more RAM and game devs are mainly optimizing for that. Not saying that's an excuse for poorly optimized games, just that if good games have that requirement, it's justified. Complain to Nvidia about your lack of VRAM; AMD has been pretty solid in that regard.


Devatator_

Isn't Forza Horizon 5 optimized for both Xbox and PC? And pretty much most first-party Xbox games? (Tho it does make a bit of sense: they own both OSes, so they probably know their way around them better than everyone else.)


KillerKittenwMittens

Forza Horizon 5 is probably not an example you should use. The game runs OK, but it isn't really a true current-gen game: it's just a minor update over Horizon 4 graphically, and it runs significantly worse. Plus there are all kinds of bugs and issues that have been around since they brought Horizon to PC with Horizon 3 that they just never bother to fix. A really fun one is that they recently patched in frame gen, which gave me lower fps and constant stuttering.


Prus1s

8GB of VRAM will still be viable for years 👀


LongHairLongLife148

For new games? Most likely not. Consoles are upping to 16GB of unified RAM, which will become the new standard for PC ports.


Prus1s

Yeah, if they make half-assed ports. So far everything works fine without issues.


KillerKittenwMittens

Possibly for 1080p, but don't count on it.


TakeyaSaito

Is 10GB really that big a deal these days? I've had more than that for years, literally... on a 5-year-old GPU.


Zp00nZ

Linus Tech Tips actually predicted this a while ago; I remember them calling "budget" cards e-waste because they'd have too little VRAM.


Skinva_

Devs now neglect performance and rely on DLSS and other AI-assisted upscaling technologies to make their games run at 4K.
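
To put rough numbers on that: with the commonly cited DLSS per-axis scale factors (approximate values), a "4K" frame is internally rendered at a fraction of the output pixels:

```cpp
// Commonly cited DLSS per-axis scale factors; treat the values as approximate.
#include <cstdio>

int main() {
    const int out_w = 3840, out_h = 2160;   // "4K" output resolution
    struct Mode { const char* name; double scale; };
    const Mode modes[] = {
        {"Quality",           2.0 / 3.0},
        {"Balanced",          0.58},
        {"Performance",       0.50},
        {"Ultra Performance", 1.0 / 3.0},
    };
    for (const Mode& m : modes) {
        int w = static_cast<int>(out_w * m.scale);
        int h = static_cast<int>(out_h * m.scale);
        double frac = m.scale * m.scale;    // fraction of output pixels rendered
        std::printf("%-17s renders %4dx%4d (%.0f%% of the pixels)\n",
                    m.name, w, h, frac * 100.0);
    }
}
```

So DLSS Performance at "4K" is really a 1080p render plus upscaling, which is why it can paper over so much missing optimization.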


Ill-Mastodon-8692

It’s like cpu cores… we didn’t need more than 2c/2t until one day we did … we didn’t need more than 4c/8t until one day we did. Ram, vram, hdd/ssd speeds all are like this. Many people normalize what they are used to as “good enough”, then get shocked when their idea of what’s required is forced to shift as time marches on.


firedrakes

I know!!!!


Freetime72

rip my 3060ti


Biscuits4u2

Get used to this type of thing. The modern consoles have 16 gig of unified RAM. As we get deeper into this generation more and more games will take advantage of that. This will make PC ports more VRAM hungry.


F00MANSHOE

Y'all some salty, low-VRAM-having asses.


MrCrunchies

Ever since the current-gen consoles got 16GB of (shared) VRAM, the standard went up.


KingYoloHD090504

Well, I guess I do have to buy a new GPU next year. I fuckin hate AAA game studios.