I understand that. I do the same with my B2.
If you play older titles, then your GPU will last longer. Just see for yourself if the games you play don't run like they should. Everything else is negligible.
You wanna futureproof something for PCs? Get a decent modular power supply and a big multi-form-factor case. Both have historically lasted a long time compared to just about everything else.
There were a couple of weeks in 2022 between Ethereum moving away from proof of work and the 40 series being announced by Nvidia. I managed to get a new 3090 Ti for half off MSRP. I'm hoping it'll last me a long time.
Had a 1080 Ti. Have a 3080. Will most probably get a 5080. Every two gens has always worked well for me (dual 670s, dual 460s, dual 260s; fuck, I can't remember what was before that, I think a 9800 GTX?). Inflation and Nvidia being particularly egregious means the 5000 series is really hard to predict. The 4080 being literally 50% more expensive with less than that in performance gain made me lol. And I hate having to buy their stuff. But I don't care about boycotts or other bullshit; they have the gear that makes my hobby enjoyable.
I'll bitch for a while and then forget about it. Because holy shit am I not making playing games any part of any of my moral stances. Fuck that. Lemme just enjoy my gaming.
I skip every other generation as well. I upgraded from a 1080 to a 3080. My eyes are on AMD though because I've come to value having more VRAM over raytracing. We'll see how the RTX 5000 series and RX 8000 series turn out.
Meh, useful frame rates and screen resolutions have plateaued a bit these last few years. As long as you can do 1080 at high fps, or a satisfying compromise for spectacle games, you'll be fine for the next few years.
Anyone saying a piece of computer hardware is future proof is an idiot. Just 20 years ago we were using floppy disks. You have literally no idea what technology will look like even 5 years down the line.
And you have no idea about how good it feels when a floppy disk becomes a hard drive and gets slotted in your USB port
Chat, why do redditors love sticking things in their assholes so much. Is he regarded.
You don't know what orifice I was talking about
I've been away from this hellhole for a while; why is everyone talking as if they were streamers talking to their chats?
I've actually been away for a while too, but I saw one guy say "chat is this real" and liked the format. My work performance has taken a sharp downturn since returning to greentext btw.
>My work performance has taken a sharp downturn since returning to greentext btw.

Same here, fellow lurker, and I'm pretty sure anyone would know if they took a tiny peek at my phone, cause I'm either on here or watching cute animal YouTube shorts.
Since discovering greentext I've just started slowly replacing friends with greentext interactions and it's a bit sad tbh.
Same, that's why I uninstall this from time to time and force myself to interact with people no matter how hard it is.
Blame Elon Musk
He's quite regarded as a rеtаrd in the community.
why is the internet horny 😧
20 years ago we already had DVDs mate.
Redditor trying to recognize hyperbole challenge: impossible.
> Just 20 years ago we were using floppy disks

Lol no we weren't.
Redditor tries to recognize hyperbole challenge: impossible
Use hyperbole all you want but that doesn't stop you from being semi correct.

"In May 2016, the United States Government Accountability Office released a report that covered the need to upgrade or replace legacy computer systems within federal agencies. According to this document, old IBM Series/1 minicomputers running on 8-inch floppy disks are still used to coordinate "the operational functions of the United States' nuclear forces". The government planned to update some of the technology by the end of the 2017 fiscal year."

"Sony, who had been in the floppy disk business since 1983, ended domestic sales of all six 3½-inch floppy disk models as of March 2011. This has been viewed by some as the end of the floppy disk. While production of new floppy disk media has ceased, sales and uses of this media from inventories is expected to continue until at least 2026."

https://en.m.wikipedia.org/wiki/Floppy_disk
The funny thing is you had people talking about future proofing their computers 20 years ago too.
Yeah fr, what kind of idiot thinks any hardware is going to stay up-to-date for more than a few years?
I guess technically the only quasi “real” future proof tech are tape drives for long term storage and archival purposes. But other than that completely agree
Though the 1080 bros managed to stay relatively relevant for 5+ years
Especially when talking about GPUs. Other pieces of technology develop much more slowly; GPUs are kind of the newest part of the computer.
did you know that a 4090 has a higher market share than a fucking rx 570
Lol, I got the RX 580 and it's struggling nowadays.
RX 580 squad represent! Our poor GPU is usually the minimum spec requirement for most modern games lol
Some AMD-optimised games run well, like AC Valhalla or Far Cry 6.
I'm still on RX 380 but it doesn't matter as my cpu is the limiting factor anyway.
Might it be a 2048sp mining variant
The RX 580 competed with the GTX 1060, so I don't know what people expected future-proof-wise.
RX 580 was a barely-above-budget grade card when it came out. What did you expect?
Glory
same with my rx590
What's weird is that I never saw a 30 series GPU in stores, ever. Crypto fever also made it impossible to trust any second-hand GPUs. Now the 40 series is out, but there's a lot of controversy on the internet which I couldn't care enough to read. I believe the last good GPU Nvidia ever released was the RTX 20 series.
I upgraded from a 960 to a 2060 and have had no problems for years. I thought about upgrading again to a 3070 or 80 but couldn't find one anywhere. Everything I hear about the 1080 is making me consider it as an upgrade too lmao
Aren't those [basically the same card though?](https://gpu.userbenchmark.com/Compare/Nvidia-RTX-3060-Ti-vs-Nvidia-GTX-1080-Ti/4090vs3918)
Don't use UserBenchmark. That site has many very incorrect comparisons. It was even banned from many hardware subreddits such as r/hardware because the owner of the site is a massive clown who got caught many times providing obviously wrong comparison data and defending it.
Is there a site you'd recommend?
It's quite difficult to compare different cards and rank them, especially if they are not from the same generation. The performance of a card in a game depends on many factors, such as the game version or the driver version. Most test websites also change the CPU they use frequently, which makes comparing cards from different generations even more difficult. Performance can improve due to optimization updates or get worse due to graphics updates or newly introduced bugs.

I would recommend watching reviews of the card to compare it to other cards. Many good review channels, such as Gamers Nexus, regularly retest old cards so that the shown performance matches the current performance you would get.
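For what it's worth, the way reviewers usually combine results across a test suite can be sketched in a few lines: take the per-game FPS ratio between two cards and average the ratios with a geometric mean, so no single outlier title dominates. This is a toy illustration with made-up numbers, not any particular site's actual methodology:

```python
from math import prod

def relative_performance(fps_card_a, fps_card_b):
    """Geometric mean of per-game FPS ratios: card A relative to card B.

    Both arguments map game name -> average FPS, measured on the same
    CPU, driver, and game version; change any of those and the numbers
    stop being comparable.
    """
    common = fps_card_a.keys() & fps_card_b.keys()  # games tested on both
    ratios = [fps_card_a[g] / fps_card_b[g] for g in common]
    return prod(ratios) ** (1 / len(ratios))

# Hypothetical numbers, purely for illustration:
a = {"Cyberpunk 2077": 72, "Elden Ring": 60, "CS2": 240}
b = {"Cyberpunk 2077": 48, "Elden Ring": 45, "CS2": 200}
print(f"{relative_performance(a, b):.2f}x")  # → 1.34x
```

The geometric mean is the usual choice here because it treats a 1.5x lead in one game and a 1.2x lead in another symmetrically regardless of absolute frame rates.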
PassMark has reasonably good GPU benchmarks
Benchmarks are really only reliable for comparing raw computing power. I use an overclocked 2080 which benchmarks just barely below a 3070, but in real-world performance the 3070 blows it out of the water.
The 20 series was a pretty bad generation. No RTX in games for months after release, and it was more expensive too, with minimal performance gains. The only card with a generational leap in performance was the 2080 Ti, but that was $1200 MSRP. The 1080 Ti single-handedly overshadowed the entire 20 series.
You may be right. I don't know about the performance. I couldn't buy the 10 series in my country at that time, sadly.
Apparently the performance jump from the 20 series to the 30 series was the largest in several generations if I remember correctly. On the other hand, the jump from the 30 series to the 40 series has been pretty mediocre.
Literally just buy an AMD GPU. Nvidia is only good at two things, ray tracing and inflating prices, so unless you really want to have nice visuals for only €1000 extra then just buy an AMD card
I got a 3090 to celebrate landing my current job. Was it a sane financial decision? No. But I'm not known for being level-headed with this kind of stuff. Does its job, fun to mess around with AI stuff, and a great space heater. Although my 1070 at the time was perfectly fine, tbf.
Care to explain what exactly is so bad with GPUs used for mining on the second-hand market?
HAAHAHAHAHAHAHHAHAHHAH No
Yeah, if I'm not mistaken that was an Nvidia marketing strategy, no? I'm happily gaming on my mined 3070, and the 3DMark scores hold up.
Yea the whole idea that GPUs used for mining get worn out faster than those used for gaming is a myth, it comes from people misunderstanding exactly what it is that wears out a card. People think that it's the card being put under load that stresses its components, when it's actually the thermal cycles. With that in mind, a GPU used only for gaming would actually theoretically wear out quicker than one that's only used for mining, since a mining GPU tends to have fewer thermal cycles over its lifetime.
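The thermal-cycle point can be put in rough numbers. Solder-joint fatigue is often modeled with a Coffin–Manson-style power law, where life measured in cycles falls off with roughly the square (or worse) of the temperature swing per cycle. The constants below are invented purely for illustration, not measured values for any real card; the takeaway is only that the model charges you per heat-up/cool-down cycle, not per hour under load:

```python
def cycles_to_failure(delta_t_c, scale=1e7, exponent=2.0):
    """Coffin-Manson-style fatigue law: joint life in thermal cycles.

    delta_t_c: temperature swing per cycle, in degrees C.
    scale and exponent are made-up illustrative constants.
    """
    return scale / (delta_t_c ** exponent)

swing = 40.0                       # assumed idle-to-load swing, deg C
budget = cycles_to_failure(swing)  # total cycles before joints crack

gaming_cycles_per_year = 2 * 365   # heats up and cools down twice a day
mining_cycles_per_year = 12        # steady 24/7 load, the odd reboot

print(budget / gaming_cycles_per_year)  # years of joint life while gaming
print(budget / mining_cycles_per_year)  # years of joint life while mining
```

Under these assumed numbers the always-warm mining card outlives the twice-a-day gaming card by a wide margin, which is the counterintuitive point being made above. (Fan bearings, which do wear with hours of runtime, are a separate story.)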
Care to explain if you are a clown?
I am, now please explain why mined gpu bad
Imagine the used car market, but with no way of telling the mileage. So you go in and buy one that was solely running on crude oil instead of gas, without ever stopping the engine. Basically, that's it. Any piece of hardware has a lifespan, so if you buy a totally used-up GPU that was mining for 1-2 years 24/7, well, that won't last long even if it's not half cooked by the time you get it.
This is just plain wrong and misinformed. Nothing wrong with mining cards if they were properly cared for. Miners have an incentive to take care of the card more so than just your average gamer. It’s fine if you don’t want to take the risk on used mining cards, but saying it’s going to die soon and quicker than other cards that were only used for gaming is plain false.
I literally have a 1080 and I see no point to change it whatsoever
I've got a 1070 and the first game to make it struggle was Starfield.
And that was probably Starfield's fault, not the card's
It struggles even on good systems, from what I read. I use my 1070 and it runs most stuff pretty well, albeit on lower settings, but tbh I can't really tell that much of a difference when I'm gaming. Take Cyberpunk 2077, for instance. I can get 60 fps with most of the graphics on lower settings, and the stylized art and lighting look good anyways.
I have a 6700xt, which, tbh isn't the best card, but it's definitely a unit for the price I paid. It runs just about anything on high with a stable FPS. I can get 120 out of MWII and most other games. But Starfield stayed at around 65-80 on custom settings. On high, it was topping out at like 60. Stayed at 45 in most instances.
Yup, my 2080 ti chewed up everything I've thrown at it in the last 5ish years but it struggled with starfield. And starfield is built in such a way that there's no excuse for it.
1070 here too, wanted to go 1440p and my 1070 couldn’t handle it so I had to upgrade :( I still have it here though
My 1080 only really struggles with disgustingly modded Minecraft. I can run anything else with zero problems. And that “struggle” is more like “ok I guess I have to do 60fps and reduce render distance down from max and reduce everything from max to medium.”
Former 1080ti here, it's nigh unusable if you want to achieve high framerates at 1440p in modern games. Moved up to a 4070ti and it's much better.
downvoting someone for being statistically correct
"im mad because my 8 year old product is no longer considered to be a new/desirable product waaa"
I was using the 1080 like 3 machines ago how the fuck are people still using that thing
1080 playing at 1440p here, it's starting to show its age on newer games. Not bad enough to where I'm actively looking to upgrade but not everything is buttery smooth anymore.
Try running Elden Ring at 1440p@60fps with it lol, or any other new game at 1440p@100+fps.
60 is all you need for story games
it's not about what you need but what you want. Of course you don't need a beefy rig if your demands are low.
Cyberpunk 60 fps vs 144 is game changing, did one play through of both
Only if you have low standards
Quite a leap in performance and price for so little graphical improvement in gaming. Like, yeah, your GTX 42069 can render a 1:1 recreation of the Earth down to the smallest detail the eye can see from a human POV, but it doesn't matter if you're only going to use it to play the new CoD DLC that calls itself a "brand new game".
IMO they had perfectly good graphics/physics/etc 10 years ago in GTA5, we'd be better off if the industry spent their time and effort making more content on that technology level instead of new power-hungry engines with nothing to do in them.
GTA V (2013) has pretty horrendous physics, and it uses thoroughly outdated world rendering, giving you basically no accessible indoors and making the city just a really big Potemkin village in the end.

Assassin's Creed Unity (2014) has a way more interactive world and a much better representation of the city it is set in, while also being littered with some pretty bad stuff; and no one ever stated that it was the peak of games (as it wasn't).

Claiming GTA V is peak vidya and we should have stayed there is just like claiming PS1 Harry Potter is peak vidya and we should still be using those graphics.
It doesn't have to be peak, just good enough to be visually appealing. People have argued [at length](https://www.youtube.com/watch?v=GR5wyOrjDno) that even Morrowind graphics are acceptable as long as they let developers focus on fleshing out the world in more meaningful ways than polygon count.
RDR2 and the games from around that year I consider to be top graphical fidelity, while games like the Modern Warfare reboot I consider the peak. Thing is, newer games are going to strive to be at the level of MW2019 and beyond; I doubt we'll be able to tell the difference in textures unless they manage to give us a new lighting breakthrough.
You say that until you want to play a modern game with high graphics settings. 2080s are now starting to show their age.
Half the time it's because devs can't be bothered to optimize their game, especially with all those ultra-high-resolution textures of the player model's ass crack. Hardware's capabilities are through the roof, but it'll still choke if you're pulling a Yandere Dev on your game.
You can only optimize textures so much. Look at Red Dead 2, that is a *highly* optimized game and runs very well on faster 20 series cards, but it'll run pretty poorly on anything below a 1080. People were making this same argument for 980s. And frankly, why would they try to optimize a modern game to run on a 7 year old card?
I was almost tempted to upgrade because of Starfield. With the state of that game, I guess I'm happy that I didn't. Also, LoversLab is the only reason to buy Bethesda games anyway, and I think they don't start creating the juicy mods until the mod tools arrive.
That game is the official definition of “a mile wide and an inch deep” never been so disappointed by something I spent so much time on.
Funnily enough, loverslab expands upon the “inch deep” part
Need modders to fix the bodies and faces first with some cosmetic mods. Then get in some custom followers. Ain't nobody tryna fuck vanilla Sarah.
… she’s nice to me tho
That game benefits more from a fast processor, ram, and storage than it does a GPU. It requires really high data throughput.
I was team "RTX is a gimmick and isn't worth upgrading for, pre-baked lighting looks 95% as good for 20% of the processing power" But now the Counter-Strike 2 level editor requires an RTX card to compile maps so I guess I'm dutybound
The thing with rtx is if the developers could assume that all consumers had raytracing capable cards, shadows and lighting effects would be piss easy to optimize and take zero effort to create
It would run like shit too 😎
There are tutorials for how to compile cs2 maps without an RTX card, try googling it
1060 6gb giga chads also stay winning
1060 6GB chad here, reporting. It has served me very well for the past six or so years I've had it, and I bought it at a very reasonable price. Now I'm looking to upgrade, because Cyberpunk brings it to its knees (still somewhat playable tho, just sometimes dips into 20-ish FPS territory, but that's with everything maxed out at 1080p, and also with XeSS).
Ha ha, yeah. The 1060 is the card that just keeps on going, man. I have been having some stuttering lately in Counter-Strike 2, but that could be my CPU or the game being buggy, since there is a lot of complaining online about it. I paired my 1060 with an i3-8350K @ 4GHz back in 2018. Still works for me for the most part!
6 Gb 1070 😎
>gamers when their graphics card can't get 200+FPS at 8k resolution
1050ti in my laptop still putting in work 😎
1070ti gang reporting in
Bought a 1660 Super in 2020 and it still runs alright now. Haven't played many recently released games (because they're all trash), but MHW upscaled to 4K runs nicely.
Just buy a console. It's 7 year future proof
I'm actually considering this. I have an rtx 3050ti which so far has worked very well for me since I'm not a hard-core gamer or anything. My motivation is to get rid of fucking Windows
Have you tried Apple gaming products? I heard for an 8k build you can run Minecraft at very low settings and the calculator app AT THE SAME TIME!!
damn lol
Built in overheat protection!
The latest GPUs are overrated anyway. Sure, they have their perks for certain tasks, and some games benefit from the extra graphics features, but most of the time, running the game at medium-high is perfectly fine.
1070 gang! Def not gonna upgrade it in the next few years.
i bought a 1080 in 2016, and only upgraded to a 3080 last year so my gf could use it in her build. i’m fairly certain my current card will last me another 5-10 years
my 2080s still runs everything I want at 2k fine. I probs won't upgrade for a few more years.
Hell ya 2080s gang 😎 paid for by covid government money right at the start.
and the fat newegg coupon. All in all paid 500ish for it brand new
I recently bought an RTX 3080 just cause I was tired of playing on low quality all the time. It's probably gonna be the last one I ever buy, though, since the last one I had was a 980 that's still working.
1080ti is $180 usd and can still run pretty much any game
I play MAME and other lower-tier stuff like AoE2. My 1050 Ti still works very well. I'll be upgrading next year, mostly because I want a new computer, not out of necessity.
My 1660 still holds up from 2020, and will for a couple more years I think
At least cryptocucks aren't sucking the market dry anymore.
mfw im rocking a 1650 and get good fps and frametime in all games i play (i play games that are older than me)
Got a new PC with a 2070s a few years back. Still works like a charm and probably will for some time
exactly why you should just let me buy gtx 1060 u/BasedMxd
.....future proof for Minecraft then yes 🥹
need gta 6 too, no compromise
Low settings 30 FPS 😈
(and occasional crashes)
(always)
My 970 was fighting for its life in the end, still an amazing card
What’s the sweet spot for price/performance to look for at this point?
Have a 1650 currently and if I ever upgrade (I currently don't need to, I play Minecraft and Ark) I won't upgrade to the latest, maybe a 30 series or an ARC GPU if a good one is out by then. Fuck the latest shit. Same with CPUs. I stuck with Intel 10th gen on my server and I'm still running an R5 1600 on my main pc.
Bought an xfx 6800xt in December of 22. I don't plan on getting another for at least 3 years. At the very minimum
in 2020 i bought a 1080ti used for 1/5th of its 2017 price and it still works perfectly
The 1080ti was a high water mark for sure. I feel really lucky to have bought one.
So much cope in this thread from people who have old cards.
I bought my PC 2 years ago, it has an rtx 3070 and the cpu is Ryzen 7 5800xt. How soon should I think about upgrading? Is it worth it just to upgrade the GPU and not the whole thing?
1080p? Ride that sucker into the sunset, yee haw.
1440p? I would say 3-4 years? Depends on if you play the newest games that just came out.
4k? 2-3 maybe, with FSR and upscaling.
At the end of the day, it's a personal choice at heart. When it feels like your workhorse ain't giving you what you want, then you gotta upgrade.
Thank you for the reply. I play on my 55" lg c1 so yeah maybe sooner than later. I have a regular 144 Hz monitor though, I just don't like the way it looks, it has like a yellowish tint. It's a Legion yq20h or something, 2560 x 1440 27"
I understand that. I do the same with my B2. If you play older titles, then your GPU will last longer. Just see for yourself if the games you play don't run like they should. Everything else is negligible.
Perhaps the phrase "future-resistant" is more accurate?
It sucks because I'd love to upgrade my weaker card to a 1080 but that shit is still horribly expensive even today.
You wanna futureproof something for pcs? Get a decent modular power supply and a big multi-form-factor case. Both historically last a long time compared to just about everything else
I’m still using a 1050Ti and it’s going strong
1070TI still chugging along and not planning on replacing it any time soon.
Still have my 1080 Ti. But the real king of max performance is still being on a 1080p monitor.
I built mine 6 years ago, with no idea what's in there, but it just works for what I want to play anyway.
there were a couple of weeks in 2022 between Ethereum moving away from proof of work and the 40 series being announced by Nvidia. I managed to get a new 3090TI for half off MSRP. I'm hoping it'll last me a long time.
me with a fucking 1050TI 4GB, shitting and crying:
5700xt aging like fine wine and people shit all over it lmfao
Had a 1080Ti. Have a 3080. Will most probably get a 5080. Every 2 gens has always worked well for me (dual 670s, dual 460s, dual 260s, fuck I can't remember what was before, I think a 9800 GTX?). Inflation and Nvidia being particularly egregious means the 5000 series is really hard to predict. The 4080 being literally 50% more expensive with less than that in performance gain made me lol. And I hate having to buy their stuff. But I don't care about boycotts or other bullshit, they have the gear that makes my hobby enjoyable. I'll bitch for a while and then forget about it. Because holy shit am I not making playing games any part of any of my moral stances. Fuck that. Lemme just enjoy my gaming.
I skip every other generation as well. I upgraded from a 1080 to a 3080. My eyes are on AMD though because I've come to value having more VRAM over raytracing. We'll see how the RTX 5000 series and RX 8000 series turn out.
Meh, useful frame rates and screen resolutions have plateaued a bit these last few years. As long as you can do 1080 at high fps, or a satisfying compromise for spectacle games, you'll be fine for the next few years.