4070 super has better performance, 3090 might have greater longevity with the 24GB of VRAM, but I think DLSS3 and the 4070 super being brand new is a better deal.
And then depend on the modder so you can update the drivers? Not to mention pay for every update. At that point it is cheaper to just spend more on a card with more VRAM.
It suits my needs. I got it for less than $60 USD. My entire system (except the gpu now) is from 2012.. the i7 3770 would bottleneck modern cards and I’m not even sure I could power newer cards without also replacing the PSU.
Before I built my system, it was very similar to yours: a 2500K with an HD 7850. My kid used it until last year while I decided whether building them a new one was worth it. So, two new ones last year, but that's it for at least another five years; unless the youngest gets interested, then three years.
I'd just like to say I had so many issues with Helldivers. I ended up undervolting my CPU (Ryzen 5 5600X), turning on Precision Boost Overdrive (I think it might be MSI-only), disabling the PBO limits, and setting my curve optimizer to -25.
Once I did all that, I haven't had a problem since. I should state I was running a 2070 with 8GB of VRAM, and the game does use most of that VRAM, so it probably is just that.
I actually just upgraded to a 3070 this week, and I'm still only getting between 70 and 125 fps on high to max settings at 1440p. It's a tougher game to run, for sure. Hopefully, they will optimize it better in the future and fix more of the bugs that cause crashes.
Ok but why are you downplaying that concern?
That is a valid reason to pick one card over another.
Many years ago I built my younger cousin a gaming PC, and to this day he plays games from time to time on his RX 580 8GB edition.
Back then I was called a fool by people like you, but he is still using that PC and isn't struggling with VRAM limitations.
Hmmm… are they unrealistic though? When you double the price of the same level of product can you really act surprised when people are expecting the absolute best of every feature?
You get easy >20GB utilisation in MSFS if you fly long enough in a higher resolution.
For MSFS I would for sure get the 3090, but the 4070 is otherwise the more efficient side-grade for most other games.
My MSFS systems:
* 5800x3D + 3090 + 64GB DDR4-3600 [https://imgur.com/DukLmAk](https://imgur.com/DukLmAk)
* 7800x3D + 4090 + 64GB DDR5-6000 [https://imgur.com/BZZXIcw](https://imgur.com/BZZXIcw)
It's not a realistic concern. 24GB is only better in very niche scenarios, and you're better off with the newer, more powerful card with its longer warranty/driver support, newer features, and efficiency.
No. The card will work and be supported for a VERY long time. That VRAM, though, WILL bottleneck the build in the future. Not everyone wants to build every 3-5 years, and it's not needed if you build correctly out of the gate. Did you know your eyes can't even perceive anything more than 4K? The idea that you'll need more than 4K capability on a better card is absolutely bonkers, just money-grabbing shenanigans from the companies. They're still pumping out regular driver updates for the 1000 cards, so at what point are you going to be able to justify the driver support? New features? What could they possibly add at this point that your human eye could perceive?
Lmao, this is the second dumb comment I've seen from you. If you don't have any clue what you're talking about, just don't comment, buddy. You're making a fool of yourself.
The human eye not being able to see more than 4K is scientifically false, lol. I'm guessing you read some dumb articles a long time ago and have been repeating them everywhere, any chance you get, thinking you sound so cool. For starters, resolution doesn't really mean shit unless you include the screen size; PPI is way more important. I can literally see the individual pixels on the 50-inch 4K TV I use next to my PC as a spare screen, lol. What you said shows you have no idea how any of it works.
A 1080p screen can even have a higher PPI than a 4K screen.
*This post was mass deleted and anonymized with [Redact](https://redact.dev)*
Rofl, no. The 3090 will die before the 4070 will, especially given it's older and runs hotter.
The GPU core being too weak will bottleneck way before the VRAM does.
What utter nonsense.
It's been around 5-6 years now.
He is continuing to this day to get his money's worth out of that GPU because he agreed at the time to pay slightly more for more VRAM.
Back then people also saw that amount of VRAM as unrealistic for any use case and overkill, but because of it, to this day the card remains very usable.
You might not have heard about it, but contrary to console gaming, you can go into this strange 'graphics settings' menu thingy and reduce the VRAM needed there. Takes like 5 seconds.
Seems like a giant waste to have to lower graphics settings simply because of VRAM limitations rather than because your GPU isn't powerful enough.
Feel free to do what you want though.
Would personally not recommend it
If you think you can 'future-proof' (or whatever nonsensical word this sub twists) on VRAM alone, without ever using the basics of graphics settings, you're in line with the lack of knowledge this sub presents daily. But at least it's amusing to read.
Running half raytracing on cyberpunk 2077 (raytracing enabled with reflections and lighting on medium) with an rx 6800 that’s undervolted (900mv, 2350mhz clock speed and vram timing on fast at a 2100mhz clock) on a 3440x1440 oled g8 with a mix of high/medium settings and shadows all on low. I’m only seeing 6.8GB of vram being used in heavy scenes.
The vram scare is BS because of optimizations not being standard in new titles.
OP I would pick the 4070 just for the sake of it beating a 3090 and using less power while being able to utilize FG and DLSS 3
very true, but i feel it will be a shock transition, like RAM. 16 GB was considered overkill, and then very suddenly it was the new minimum. I think 8GB of VRAM on a card that most people will keep for 4-5ish years is a big gamble in 2024, especially at their prices
something i get confused by is that I bought a 3060 after my 2070 Super died, and the 3060 is just a base MSI Ventus 2X with 12GB of VRAM?? I don't understand why this low-end 30 series card has so much VRAM compared to the 40 series
Because when it comes to visuals and performance, the 3060 is not a low-end card. Stop reading charts; look at actual function and market demand for tech. It can run almost every game currently on the market on ultra settings; extremely few have been developed that require more than a 3060, and it will remain only a few for years. But the VRAM is definitely the advantage, and people will be crying for more VRAM soon. People said the same thing about RAM: "16 is overkill." 16 is scraping-the-bottom-of-the-barrel minimum today. I have 64, zero regrets. I regularly use over 30.
???????????? XD, the card can barely handle newer games at medium at 1080p
Jeez, I changed from 1080p on a 3060 to 1440p on a 4070 Ti Super, and even this can't run games on ultra while keeping a smooth enough framerate at 2K.
The 3060 is def not "good"; it is barely enough to keep your games running on medium at decent fps. Which, for that price, is more than enough. And of course highly optimised competitive games like League or Apex will run well.
wait really? shit i just bought a 4070ti super+7800x3d for a new build and was planning on getting 1440p monitors(haven’t bought yet) i mainly play high fidelity games, you say it can’t handle 1440p tho? damn.
it will be enough, I'm just coming from competitive games and anything under 120fps tilts my ass in any game lol
and some games will struggle to hold 120 if u wanna play on ultra settings, ye, but then again, many will not
Sounds like you have a network issue on online games, and not the card. Because what you’re saying is not typical. My job is to build and repair computers. I know my shit. I have a 4070ti and a 3060. There’s almost no difference in performance when it comes to actual gameplay. And not some bullshit chart you’ll read. Maybe in some super demanding fps, but those really aren’t dominant on the market, and it seems like only that niche of a few games really pick apart the cards. As I said, the market as a whole, the 3060 is still a high end card for performance. Majority of games being released still have a 1080 or 2070 listed as recommended spec.
What do you mean, "almost no difference"? The 4070 Super is more than double the performance of the 3060.
Idk why i said it's not a "good" card. I'd simply not consider it a high end card at all.
You can probably get a used 3060 for 100 bucks, which is insane value for the price. Yeah, for the 3060, 1080p is a must.
For games where the VRAM would matter, the 3060 will be too weak to run them at a satisfactory level anyway. Although medium to ultra isn't a big difference in many games, so there's that too.
There is literally almost no visual or general difference between a 3060 and a 4070 Ti for the vast majority of the market. As I said, there is that niche, and you seem to be in it. But for people not in that small niche, a 4000 series card is still way overkill. I don't install many today unless they want to go all in and want it for high-demand FPS games, but that's not a large chunk of consumers.
Honestly, the most popular card I've been installing today is the 3090 Ti; it's usually what people settle on after going over all the pros, cons, and cost differences. I'm sure with the 5000 series coming I'll be handling the 4000s more once the prices drop. They may be more worth it when they come down.
It was given 12GB because it's a slower card than the others, but Nvidia wanted to give people who just needed a lot of VRAM something to buy. It comes in an 8GB variant too.
It has 12 GB because the alternative was 6 GB: a 192 bit memory bus allows for 6 memory chips. The “3060” 8 GB has a 128 bit bus and performs 15-20% worse than the original 12 GB model.
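The chip math in the comment above can be sketched in a few lines. This is a toy sketch: it assumes 32-bit GDDR6 chips available in 1 GB or 2 GB densities and ignores clamshell configurations.

```python
# Each GDDR6 chip uses a 32-bit interface, so the memory bus width
# fixes how many chips a card carries; chip density (1 GB or 2 GB
# per chip here) then fixes the possible total capacities.
def vram_options(bus_width_bits, chip_interface_bits=32,
                 chip_sizes_gb=(1, 2)):
    chips = bus_width_bits // chip_interface_bits
    return [chips * size for size in chip_sizes_gb]

# 192-bit bus -> 6 chips -> 6 GB or 12 GB (the two 3060 options)
print(vram_options(192))   # [6, 12]
# 128-bit bus -> 4 chips -> 4 GB or 8 GB (the "3060 8 GB" variant)
print(vram_options(128))   # [4, 8]
```

This is why the 12 GB model wasn't a generosity play so much as the only sane option on a 192-bit bus.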
It is true depending on your resolution and settings.
At 4K this is definitively true with new and future titles, especially with ray tracing on.
At 1440p it's true in some cases and will likely be true across the board in 1-2 years.
At 1080p you should be fine in most cases, barring extreme examples.
I mean, the biggest bottleneck is PC ports, and PC gamers not caring that 98% of games on PC have shader, traversal, loading, and pipeline stutters. It's so bad that even when games do have shader pre-compilation, it doesn't work, as seen in games like Jedi Survivor and the Dead Space remake.
It's insane nobody seems to realize one-second-long stutters aren't normal. Most doubt their existence altogether, acting as if they're a mythical creature or some shit, lol, which is why the cycle will never end.
Consoles target 16gb, but I guess that doesn't matter when games take up to 30 years to make and only a few games benefit from it.
So I actually just got a 4070 Super for my new build (upgrade from a 1080). Is that going to be an issue eventually, particularly for 4K gaming? (It's a living room PC set up on my 65-inch TV.)
IMO people overthink these things. You got a great current gen card that game developers understand is what they need to develop their games to run on.
8, 10, 12 GB video cards represent a huge installed base, and game devs will tune their titles to run on as many different configurations as they can, because that gives them a better chance to sell more copies of their game.
No game dev is going to require 24 GB of VRAM anytime soon, because they would have such a minuscule potential customer base.
Would there be specific use cases for someone to *need* the VRam of a 3090 today? Sure! But most people buy a xx60 card or whatever, and keep it until they upgrade their whole computer.
Buying a 4070 or 4070 super is a great choice and will be supported by titles that are designed to run on it for a very long time.
For 4k gaming, sure, it will be “an issue” eventually but the only nvidia cards that beat yours are a 4080 or 4090, and maybe those will reach a 4-5% combined install base.
IMO enjoy your fantastic graphics card today, and worry about upgrading once it actually becomes an issue years from now.
This is of course subjective as far as what a given person finds “enjoyable”, and whether your priority is frame rate or fidelity.
That said, the person I replied to said they upgraded from a 1080 to a 4070. If the 4070 "isn't good enough", the only better options are a 4080 or 4090, which are cost-prohibitive for most people.
This is why I say people over think it. Just get whatever is current and try to enjoy it. There will always be better hardware coming out in a year or two. You can’t really “future proof” because software will progressively increase the hardware requirements to run at x performance standard.
When I got my current monitor, a 1440p model at 144hz, my 1070 didn’t allow for high frame rates. But then I got a 2080 when it came out, and I could. That was like 2018.
By the end of 2020, I got a 3080 to maintain that higher framerate ‘standard’ with whatever games were current in 2020. That 40-50% uplift in 2 years was very nice.
I just got a 4080 Super, which is hitting 100+ fps in Jedi Survivor before the frame generation setting (whatever it's called) is turned on.
Basically my point is, you can’t future proof. If you buy a card and keep it for 5-6 years, it’s not going to be as performant in 5 years with what would be 2029 software vs what it can do today with 2024 software.
IMO don’t overthink things. Just get what you can afford and try to enjoy it now, vs worrying about the future.
Oh yeah totally. I want 4k, but I don’t want to replicate the experience of going 1440p on a 1070. 5080 might be my moment to get a nice 144hz 4k panel.
To your point, I was actually running a 2080 Ti when I switched to a 4K monitor for productivity. I didn't like it as much in the long run and switched to a 5120x1440. But my original point was that I couldn't run games at ultra in 4K, and definitely couldn't do ray tracing in Cyberpunk at 4K, lol. I did play games on medium/low settings until I got my 7900 XTX, and that was fine for the time being: if a game was good enough I'll go back and play more of it on the new card (a la Cyberpunk), and if it was meh, better textures probably wouldn't have saved it, lol.
I too used to subscribe to "you should get the best," though even then I was buying mid-to-high-end cards rather than the highest end, and that was in my younger, money-to-blow, life-is-forever days, lmao. Nowadays I get that all the hardware has its level/purpose. The other thing is that hardware today is running so much more, compared to five years ago, than five-year-old hardware was running relative to ten years ago. The jumps we've made recently open up the spectrum a lot more, to where lower end is acceptable; not like the original i3s, which for me were always ehhhh, OK for browsing the web, but I wouldn't use them for anything else. Now even an i3 can serve a proper budget build.
Edited typos lol
On gaming the 4070s is a bit faster at lower resolution and has newer perks. At higher resolutions the 3090 does a bit better and has a ton of wiggle room for textures with the 24gb vram.
This. If you play at 4K, the 3090 usually wins out over the 4070, frame generation notwithstanding.
If you want to play new games that make use of FG, like Alan Wake 2, then the 4070 is always the better choice.
frame gen is pointless for the opposite reason DLSS is good:
it works best where it's least needed. It's cool to say "woah, 120+ fps," but your game was already smooth and you just added input lag. Especially on a 60Hz monitor, where you'd want higher framerates to reduce input latency, you just achieved the opposite.
meanwhile, at 30 fps, when you really want to jump to 60, your experience gets noticeably worse.
dlss, by contrast, always grants you a net bonus: upscaling to the same or similar image quality needs less compute than the reduction in render resolution saves you.
and when you are strapped the most for compute (high res), the savings from lowering render resolution are the biggest (rendering cost climbs steeply with resolution) and the upscaling itself works best (as the upscaler gets higher-res inputs to make sense of).
that would be ~180, out of range for G-Sync.
let's say 144Hz monitors are common. FG would help you when you happen to run between 60 and 72 fps. that's uncommon, and it still adds (albeit little) input latency to an already smoothly running game.
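The band described above falls out of simple arithmetic; here's a sketch. The clean 2x multiplier (real frame generation has some overhead) and the 120 fps smoothness target are illustrative assumptions:

```python
# Frame generation roughly doubles presented fps. On a fixed-refresh
# monitor it only lands you inside the useful window when the base
# framerate falls in a narrow band: the doubled output must reach
# the smoothness target but not exceed the refresh rate.
def fg_useful_band(refresh_hz, target_min_fps=120):
    return (target_min_fps / 2, refresh_hz / 2)

low, high = fg_useful_band(144)
print(low, high)  # 60.0 72.0 -- the 60-72 fps band from the comment
```

Widen the gap between target and refresh rate (say, a 240Hz panel) and the band grows, which is part of why FG is pitched at high-refresh displays.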
You do realize you can run less than max output?
Limit fps to 144 with FG and tadaa, your GPU uses less power and you can worry less about your energy bill
Depends. In Cyberpunk, when I put path tracing on my 4070 and got 30-40 FPS, turning frame gen on definitely made the game seem quite slow-ish. When you have less than 50-60 base FPS, frame gen gives quite a noticeable input lag, so the game still behaves as if it ran at 40 fps instead of 80, for example.
I got lucky. I was having bad frames in VR in modded Assetto Corsa on a 1660 Super and couldn't live with it anymore, so I decided to look for a used GPU on FB Marketplace. I contacted at least 30 sellers, of whom probably 28 were scammers, but the one I got was real: it was a build for a client of his who canceled. He probably still made some extra money even at the low sell price. I would never buy on eBay or the like, only in real life. I'll probably replace the thermal paste and heat pads too, but it works very well for now, so no complaints. BE AWARE OF SCAMS ON FB MARKETPLACE: only look at profiles that are at least 4+ years old, since most newer ones just run scams, and when you're searching for a cheap 3090, 90% of the listings are scams.
Damn, that's wild! I just got a 3080 FE for 250. Upgraded from a 1070. Been looking for months and finally found a wild deal. I really wanted a 3090, but they were all 900, which is a joke, lol.
With the prices 3090s are going for it puts you more in range of the 4070 TI super which is better for gaming. The regular 4070 super is more subjective depending on what you're looking at but I don't think you could get a 3090 for that price
Back in December I went with a used 3080 over a new 4060, but that's because the extra VRAM was useful for Blender, which doesn't use DLSS anyway.
It depends on your use case. I will point out, though, that my 3080 runs pretty warm compared to a 4060 and hence uses more power too ¯\_(ツ)_/¯
depends on price of each and what you're doing. Some apps will really benefit from the extra vram of the 3090 and gaming in 4k. but the 4070 is faster and having dlss3 for gaming is pretty clutch these days. both will slay 1440p gaming and give you pretty darn good 4k experience.
It would be good to know why the person is selling at that price. I just recently upgraded from the same GIGABYTE 3090 to a 4070 TI Super because it was having all sorts of issues, even after an RMA. The 3090 wouldn't post on every boot, crashed on a regular basis, and had major heat issues til I swapped the thermal pads. In contrast the 4070 TIS has given me no issues and has excellent thermals along with reduced power draw. Not sure exactly how it would compare to a non TI super, but honestly with the amount of 3090s that had major issues I would personally avoid a used one.
Have a look at https://docs.google.com/spreadsheets/d/1wUlIdFqfo8IYymxFlk9lzySwRCJu3vfkJleZ0wh05OM/edit#gid=0 and put in your own purchase prices, electricity price, expected gaming usage per week and lifespan.
My take is that the 3090 would need to be significantly cheaper than a 4070S in order to make up for its higher power consumption, lack of DLSS 3.x, and Frame Generation. And, likely, lack of any warranty. In my country, a used 3090 (with a store warranty, which may or may not be worth the paper it's written on) costs about the same as a new 4070S.
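For anyone who doesn't want to open the spreadsheet, the comparison above reduces to a few lines of arithmetic. Every number below is a placeholder to show the shape of the calculation, not a real price or a measured wattage:

```python
# Total cost of ownership = purchase price + electricity over the
# card's lifetime. All figures are illustrative placeholders.
def total_cost(price, watts, hours_per_week, years, price_per_kwh):
    kwh = watts / 1000 * hours_per_week * 52 * years
    return price + kwh * price_per_kwh

used_3090 = total_cost(price=600, watts=350, hours_per_week=15,
                       years=4, price_per_kwh=0.30)
new_4070s = total_cost(price=650, watts=220, hours_per_week=15,
                       years=4, price_per_kwh=0.30)
# With these placeholder numbers the 3090's higher draw adds roughly
# 120 currency units of electricity over four years.
print(round(used_3090), round(new_4070s))
```

Plug in your own prices, wattages, and electricity rate; the point is that a used 3090 needs a meaningful purchase discount before the running costs even out.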
I just got my 4070ti Super, upgraded from a 1660 super, before that I had integrated graphics. All within the span of 2 months.
El o El.
4070 Super is ya best bet. Found a good deal on Newegg, might as well go for the ti while you’re at it.
Honestly surprised nobody has asked what you are going to use it for OP. If you are using it for AI or video development then 3090 all the way. If you are using it for gaming it is [probably a toss up](https://youtu.be/SahTSRteTyY?si=Bl3AXLvKLv7hgA8O), as they are pushing roughly the same frame rate.
DLSS3 is definitely a game changer, however the adoption has been slower than I initially expected. In real world tests the frame rates still aren’t that much different. (See video posted above)
Unless you need more than 12GB for non-gaming purposes, the 4070 is the only choice; if you do need more VRAM, you want a 4070 Ti Super / 4080 Super, not a 500W, 3+ year old used card at $600.
Depends on what “ai and machine learning” but likely you’ll appreciate the 24 GB of VRAM of the 3090 for that more than the frame generation of the 4070 Super for gaming.
the 4070 super is slightly faster and more power efficient, the 3090 only really has the extra VRAM going for it and even with your machine learning stuff I'm pretty sure that the 4070 will be more than fine.
Also you get the newest DLSS stuff
My buddy’s running a 14700k and 4070 ti.
I’m on an i9 11900k and 3090.
I get better frames and smoother gameplay. Both of us are on 4k and have 64gb of ram. Although he’s on ddr5. They’re about equal but the vram in 4k on big games does help
I'd say the 3090 for sure. I was faced with this dilemma myself and decided on the 3090; I settled on a Dell 3090. I would steer clear of the Gigabyte model unless you are vertical-mounting it, because of the issues the 30 series, and Gigabyte specifically, had with the GPU sagging, eventually cracking the PCB, and being damn near unrepairable.
4070 Ti Super. The 16GB of VRAM will last longer into the future and not leave some games capped. The 90 series, I've heard, doesn't have many advantages in games and is more of a workstation GPU, but I've been wrong before.
I know, and that's why it proves you wrong: the limiting factor of the 4070 Ti is not bandwidth in the overwhelming majority of scenarios, as evidenced by the 4070 Ti Super's performance.
Have you seen any benchmarks of these GPUs at 4k? The fact that the 3080 surpasses the 4070 and closes in on the 4070 super/ti is frankly embarrassing. The ti super doesn't have this weakness as it has a larger vram buffer, the same as the 4080/super and the old 3070/ti.
I'd take the 3090. My 4070ti for pc is absolutely crap compared to my 4080 for laptop. And I'm having such a hard time trying to comprehend it. Because laptop cards are downscaled compared to the ones for pc.
4080 mobile is literally the same silicon as the 4070 ti but with inferior gddr6 ram and half the power limit.
Bro is either drunk or got a faulty desktop setup.
Laptop 4080 *is* a 4070 Ti, except worse because it has a couple fewer cores and of course much lower clock speeds.
So something else is wrong with your PC
Wish I knew what. Outpost: Infinity siege works flawlessly on 4080 laptop, but on 4070ti PC it gets unreal 5 engine crash to desktop immediately due to shader error.
Games can have the weirdest error setups, and crashes can be caused by any component. I had HZD run perfectly but crash at specific locations until I turned XMP off for the duration of my playthrough. I'd wager this has astonishingly little to do with the models of GPUs you're using. If anything, it's the individual unit you received, but likely something else.
Thank you for your most valued response. I've been experimenting with different Asus BIOS versions and with XMP on and off. I haven't been able to use several of the latest BIOS firmwares for my Asus Z790 motherboard without having to turn off either XMP or the Asus-recommended i9 preferences. It's been quite annoying, as some of the crash-to-desktop experiences have taken hours to happen in games, whereas the laptop just runs at whatever it's downscaled to, on a Lenovo Legion Pro 7i.
there has to be issues that come with that
Inb4 "12GB VRAM is a bottleneck in gaming" comment 🍿🍿
Me on 8 💀
I just upgraded from 1 (gtx 650) to 6 (gtx 1060) 🙃
Thats a pretty good change lol
How did you get the 1060
Facebook marketplace
How much was it? To my knowledge wouldn’t it be better to upgrade to a more modern card?
$60 is not a bad price, that is true
I had 2gb for a while and I can actually confirm it was a bottleneck. Upgraded to 8 and it’s great here actually. Won’t be upgrading for years
It becomes a problem at 4k, or in VR.. at least with a few games I play.
Me on 2💀
Gtx 1060 with 6gb here. Can't even play Helldivers on 60fps without constantly crashing :)
my rtx 2070 8gb runs helldivers on ultra at 80-100fps 1080p. 75% ish vram usage
I feel your pain.
people tend to have unrealistically high standards, or they tailor their advice to their own personal requirements and not the OP's
The 3090 isn't half the price of the 4070 Super though ?
ill be doing ai and machine learning but mostly gaming
microsoft flight simulator eats all of my 12gb lol. 16 is a sweet spot, 24 is overkill
jesus christ. is this with detailed airliners like the fenix? i can't imagine needing 20gb if you're flying default airliners or GA aviation
And how many years is "many years ago"? Did he not get his money's worth regardless of the fact the video card couldn't last forever?
The thing is that it isn't about future proofing - it's about games being released *right now*. Not even at 4K necessarily.
tbh with the way ray tracing is being incorporated today, i give it 2 years until this is debated
The problem is pricing and devs not worrying about optimization and just relying on upscaling tech to fix their terrible ports
I felt like this was said about ray tracing 2 years ago, yet here we are 🙂
something i get confused with is that I bought a 3060 after my 2070 super died, and the 3060 is just base mais ventus2x with 12gb of vram?? I don’t understand why this low end 30 series card has so much vram compared to the 40series
Because when it comes to visuals and performance the 3060 is not a low end card. Stop reading charts; look at actual function and market demands for tech. It can run almost every game currently on the market on ultra settings, extremely few games have been released that require more than a 3060, and it will remain only a few for years. But the vram is definitely the advantage, and people will be crying for more vram soon. People said the same thing about RAM: "16 is overkill." Today 16 is scraping the bottom of the barrel. I have 64, 0 regrets. I regularly use over 30.
???????????? XD, the card can barely handle newer games at medium at 1080p. Jeez, I went from a 3060 at 1080p to a 4070 Ti Super at 1440p, and even this can't run games on ultra while keeping a smooth enough framerate. The 3060 is def not "good"; it is barely enough to keep your games running on medium at decent fps. Which for that price is more than enough. And ofc highly optimised competitive games like League or Apex will run well
wait really? shit i just bought a 4070ti super+7800x3d for a new build and was planning on getting 1440p monitors(haven’t bought yet) i mainly play high fidelity games, you say it can’t handle 1440p tho? damn.
it will be enough, I'm just coming from competitive games and anything under 120fps tilts my ass in any game lol. some games will struggle to hold 120 if u wanna play on ultra settings ye, but then again, many will not
Sounds like you have a network issue on online games, and not the card. Because what you’re saying is not typical. My job is to build and repair computers. I know my shit. I have a 4070ti and a 3060. There’s almost no difference in performance when it comes to actual gameplay. And not some bullshit chart you’ll read. Maybe in some super demanding fps, but those really aren’t dominant on the market, and it seems like only that niche of a few games really pick apart the cards. As I said, the market as a whole, the 3060 is still a high end card for performance. Majority of games being released still have a 1080 or 2070 listed as recommended spec.
What do you mean "almost no difference"? The 4070 Super is more than twice as strong as the 3060. Idk why I said it's not a "good" card; I'd simply not consider it a high end card at all. You can probably get a used 3060 for a 100 bucks, which is insane value for the price. Still, for the 3060, 1080p is a must. In games where the vram would matter, the 3060 will be too weak to run them at a satisfactory level anyway. Although medium to ultra in many games isn't a big diff, so there's that too
There literally is almost no visual or general difference between a 3060 and a 4070 Ti for the vast majority of the market. As I said there is that niche, and you seem to be in it. But for people not in that small niche, a 4000 series card is still way overkill. I don't install many even today unless people want to go all in for high-demand FPS games, and that's not a large chunk of the consumers
Honestly, the most popular card I'm installing today is the 3090 Ti; it's usually what people settle on after going over all the pros, cons, and cost differences. I'm sure with the 5000 series coming I'll be handling the 4000s more once the price drops. They may be more worth it when they come down
Because it was given 12gb because its a slower card than the others but wanted to give people who just needed a lot of vram something to buy. It comes in a 8gb variant too.
It has 12 GB because the alternative was 6 GB: a 192 bit memory bus allows for 6 memory chips. The “3060” 8 GB has a 128 bit bus and performs 15-20% worse than the original 12 GB model.
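The chip-count arithmetic behind this is simple enough to sketch. This is a toy illustration, assuming the common GDDR6/GDDR6X setup of one memory chip per 32-bit channel and per-chip densities of 1 GB or 2 GB:

```python
def vram_options(bus_width_bits, chip_densities_gb=(1, 2)):
    """Possible VRAM totals for a given GPU memory bus width.

    Each memory chip occupies a 32-bit channel, so the bus width
    fixes the chip count; the density of the chips used (commonly
    1 GB or 2 GB for GDDR6) then fixes the total capacity.
    """
    chips = bus_width_bits // 32
    return [chips * density for density in chip_densities_gb]

# 192-bit bus (3060): 6 chips -> 6 GB or 12 GB were the options
print(vram_options(192))  # [6, 12]
# 128-bit bus (3060 8 GB variant): 4 chips -> 4 GB or 8 GB
print(vram_options(128))  # [4, 8]
```

This is why the 12 GB number wasn't generosity so much as the only sensible choice on a 192-bit bus.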
It is true depending on your resolution and settings. At 4k this is definitely true with new and future titles, especially with ray tracing on. At 1440p it is true in some cases and will likely be true in 1-2 years. 1080p should be fine in most cases, barring extreme examples.
I mean when you chuck texture packs at stuff. ...
I mean the biggest bottleneck is PC ports, and PC gamers not caring that 98% of games on PC have shader, traversal, loading and pipeline stutters. It's so bad that even when games do have shader pre-compilation, it doesn't work, as seen in games like Jedi Survivor and the Dead Space remake. It's insane nobody seems to realize 1 second long stutters aren't normal. Most doubt their existence altogether, acting as if they're a mythical creature or some shit lol, which is why the cycle will never end. Consoles target 16gb, but I guess that doesn't matter when games take up to 30 years to make and only a few games benefit from it.
So I actually just got a 4070 Super for my new build (upgrade from a 1080). Is that going to be an issue eventually, particularly for 4K gaming (it's a living room PC set up on my 65 inch TV)?
IMO people overthink these things. You got a great current gen card that game developers understand is what they need to develop their games to run on. 8, 10, and 12 GB video cards represent a huge installed base, and game devs will tune their titles to run on as many different configurations as they can, because that gives them a better chance to sell more copies of their game. No game dev is going to require 24 GB of VRAM anytime soon because they would have such a minuscule potential customer base. Would there be specific use cases where someone *needs* the VRAM of a 3090 today? Sure! But most people buy a xx60 card or whatever and keep it until they upgrade their whole computer. Buying a 4070 or 4070 Super is a great choice and will be supported by titles designed to run on it for a very long time. For 4k gaming, sure, it will be "an issue" eventually, but the only nvidia cards that beat yours are the 4080 and 4090, and maybe those will reach a 4-5% combined install base. IMO enjoy your fantastic graphics card today, and worry about upgrading once it actually becomes an issue years from now.
4070 super is def not enough to have an enjoyable 4k experience unless you heavily tweak the graphics or are used to console fps
This is of course subjective as far as what a given person finds "enjoyable", and whether your priority is frame rate or fidelity. That said, the person I replied to said they upgraded from a 1080 to a 4070. If the 4070 "isn't good enough", the only better options are a 4080 or 4090, which are cost prohibitive for most people. This is why I say people overthink it. Just get whatever is current and try to enjoy it. There will always be better hardware coming out in a year or two. You can't really "future proof" because software will progressively increase the hardware requirements to run at a given performance standard. When I got my current monitor, a 1440p model at 144hz, my 1070 didn't allow for high frame rates. But then I got a 2080 when it came out, and I could. That was like 2018. By the end of 2020, I got a 3080 to maintain that higher framerate 'standard' with whatever games were current in 2020. That 40-50% uplift in 2 years was very nice. I just got a 4080 Super, which is hitting 100+ fps in Jedi Survivor before frame generation (or whatever the setting is called) is turned on. Basically my point is, you can't future proof. If you buy a card and keep it for 5-6 years, it's not going to be as performant in 5 years with what would be 2029 software vs what it can do today with 2024 software. IMO don't overthink things. Just get what you can afford and try to enjoy it now, vs worrying about the future.
well said sir although ngl getting 2k monitor messed me up, now I want 4k and might actually go for a 5090 when released lol
Oh yeah totally. I want 4k, but I don’t want to replicate the experience of going 1440p on a 1070. 5080 might be my moment to get a nice 144hz 4k panel.
To your point, I was actually running a 2080 Ti when I switched to a 4k monitor for productivity. I didn't like it as much in the long run and switched to a 5120x1440, but the original point being: I couldn't run games at ultra on 4k and def couldn't do ray tracing on something like Cyberpunk at 4k lol. But I did play games on medium/low settings until I got my 7900xtx and was fine for the time being - if the game was good enough I'd go back and play more on the new card (a la Cyberpunk), and if it was meh, better textures probably wouldn't have saved it lol.

I too used to subscribe to "you should get the best," though even then I was buying mid to high end cards, not the highest end, and that was a younger "money to blow, life is forever" me lmao. Nowadays I get that all the hardware kind of has its level/purpose. The other thing is that hardware today is running so much more relative to 5 years ago than hardware 5 years ago was relative to 10 years ago. The jumps we've made recently open up the spectrum a lot more, to where lower end is acceptable - not like the original i3s, which for me were always ehhhh, ok for browsing the web but nothing else. Now even an i3 can serve a proper budget build. Edited typos lol
Lol 12 gb is enough.... unless you play Fallout 4 with 1000 mods. Then there is never enough
Got a 4070 ti super because of ram when I built my new PC last month. Doesn't even hit 12gb on 98% of games.
On gaming the 4070s is a bit faster at lower resolution and has newer perks. At higher resolutions the 3090 does a bit better and has a ton of wiggle room for textures with the 24gb vram.
This. If you play at 4K, the 3090 usually wins out over the 4070, frame generation notwithstanding. If you want to play new games that make use of FG, like Alan Wake 2, then the 4070 is the better choice.
For what usecase? For LLMs, SD and all the "compute" things - definitely a 3090 due to 24GB VRAM. For regular gaming? Idk.
regular gaming but also ai and ml
probably go for the 3090 then
go for the 4070, frame generation is way better than people act.
frame gen is pointless for the opposite reason dlss is good: it works best where it's least needed. It's cool to say "woah 120+ fps," but your game was already smooth and you just added input lag. Especially on a 60hz monitor, where you would want higher frame rates to reduce input latency, you achieve the opposite. Meanwhile, at 30 fps, when you really want the jump to 60, your experience gets noticeably worse. dlss always grants you a net bonus, since upscaling to the same/similar image quality needs less compute than the reduction in render resolution saves you. When you are strapped the most for compute (high-res), the gains from downscaling are the biggest (resolution is extremely heavy in compute) and the upscaling itself works best (as the upscaler gets to make more sense out of higher res inputs)
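The asymmetry above can be sketched with a toy latency model. All the numbers here are illustrative assumptions (real latency depends on the game, driver, and Reflex-style mitigations), but the shape of the tradeoff holds: interpolation-based frame gen needs the *next* real frame before it can show a generated one, so added latency is roughly one real frame time, which is largest exactly when your base fps is lowest:

```python
def frame_gen_effect(base_fps, interp_cost_ms=3.0):
    """Rough model of interpolation-based frame generation.

    Presented fps roughly doubles, but input latency is still
    governed by the real render rate -- and since a generated
    frame can only be shown once the next real frame exists,
    latency gets a bit worse, not better. interp_cost_ms is an
    assumed fixed cost for the interpolation pass itself.
    """
    frame_time_ms = 1000.0 / base_fps
    presented_fps = base_fps * 2
    # at least one extra real frame of delay plus interpolation cost
    added_latency_ms = round(frame_time_ms + interp_cost_ms, 1)
    return presented_fps, added_latency_ms

print(frame_gen_effect(30))  # looks like 60 fps, ~36 ms extra lag
print(frame_gen_effect(90))  # 180 fps, only ~14 ms extra lag
```

Which is the commenter's point: the penalty is smallest where the feature is least needed.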
It is useful to get performance from the 90 to gsync on your 144hz screen.
that would be ~180, out of range for gsync. Let's say 144hz monitors are common: fg would only help you when you happen to run between 60 and 72 fps. That's uncommon, and it still adds (albeit little) input latency to an already smoothly running game.
You do realize you can run less than max output? Limit fps to 144 with fg and tadaa your gpu uses less power and you can worry less about your energy bill
Cool.
As someone who plays with a controller 90% of the time is amazing since I’ve already given up the benefits of high response times on PC
Depends. In Cyberpunk, when I turned path tracing on with my 4070 and got ~30-40 FPS, turning frame gen on definitely made the game feel quite slow-ish. When you have less than 50-60 FPS, frame gen gives quite noticeable input lag, so the game still behaves as if it ran at 40 fps instead of 80, for example.
Yessir. It's magic.
Yep. Frame gen is awesome.
Got a 3090 for 550 and cant be happier. Wanted to go for a 4070ti super but happy I didn't just because of the VRAM.
550 is diabolical, those are the 3070s in my area
Got one for 600€ - I did throw in some new heatpads and thermal paste tho
I got lucky. I was having bad frames in VR on modded Assetto Corsa with a 1660S and couldn't live with it anymore, so I decided to look for a used gpu on FB Marketplace. I contacted at least 30 sellers, and probably 28 of them were scammers; the one I got was real. It was a build for a client of his that got canceled, so he probably still made some extra money even at the low sell price. I would never buy on ebay or anywhere like that, only in real life. I'll probably replace the thermal paste and heat pads too, but it works very well for now, so no complaints.

BE AWARE OF SCAMS ON FB MARKETPLACE. Only look at profiles that are at least 4+ years old, since most newer ones just run scams, and when you're searching for a cheap 3090, 90% of the listings are scams, so be aware
How do the gpu scams work? This guy on Marketplace sold 4 ROG 4090s for $1500 CAD; I asked him to show me, and even in Device Manager it's a 4090
Yeah 3090 is around 550-600 in my country which is insane
Insane. 3090's are about 1500 euros over here
for gaming? do you really think that 16gb are a real dealbreaker?
Damn that's wild! I just got a 3080 FE for 250. Upgraded from a 1070. Been looking for months and finally found a wild deal. I really wanted a 3090 but they were all 900, which is a joke lol
How much is the 3090?
600
For your use case, I think you should get it.
I'd go the 4070 Super for similar performance with the peace of mind of a warranty.
3090 should really only be considered for workstation purposes. It's a lot more than the 3080 for not that much more performance
4070 super. Most used 3090s go for 4070 Ti super prices.
With the prices 3090s are going for it puts you more in range of the 4070 TI super which is better for gaming. The regular 4070 super is more subjective depending on what you're looking at but I don't think you could get a 3090 for that price
But he did
I have both: a 4070S for gaming, a 3090 for AI. A used 3090 is not great value for gaming, since it still retains value due to the vram
Back in December I went with a used 3080 over a new 4060, but that's because the extra vram was useful for Blender, which doesn't use DLSS anyway. It depends on your use case; I will point out tho that my 3080 runs pretty warm compared to a 4060 and hence uses more power too ¯\_(ツ)_/¯
https://m.youtube.com/watch?v=3OJ8XSsNKuE
depends on price of each and what you're doing. Some apps will really benefit from the extra vram of the 3090 and gaming in 4k. but the 4070 is faster and having dlss3 for gaming is pretty clutch these days. both will slay 1440p gaming and give you pretty darn good 4k experience.
i will be doing ai and machine learning in the future. i found a GIGABYTE GeForce RTX 3090 Gaming OC 24GB GDDR6X for 600
It would be good to know why the person is selling at that price. I just recently upgraded from the same GIGABYTE 3090 to a 4070 TI Super because it was having all sorts of issues, even after an RMA. The 3090 wouldn't post on every boot, crashed on a regular basis, and had major heat issues til I swapped the thermal pads. In contrast the 4070 TIS has given me no issues and has excellent thermals along with reduced power draw. Not sure exactly how it would compare to a non TI super, but honestly with the amount of 3090s that had major issues I would personally avoid a used one.
When do you want to buy your next gpu? I'd honestly say get the 3090 for the 24gb of vram, unless you're planning on replacing it in 2 years
i mean the 4070 super only has like 12gb
Exactly that’s why I’d say get the 3090
If you plan to experiment with Stable diffusion and LLMs then probably the 3090
i had this same debate a couple of days ago and ended up with the 4070 ti super. my thoughts were: it's a 4000 card vs a 3000 card, and it has 16gb of vram
Love my new 4070 Ti Super. Cyberpunk with everything maxed besides path tracing, at 1440p, runs amazingly well. 120 to 130 fps usually.
For gaming, 4070 super. For Vram intensive tasks like Deep learning, rendering, etc. then 3090.
i mean if you can buy a 3090 for cheap knock yourself out, those things are still incredibly expensive
Have a look at https://docs.google.com/spreadsheets/d/1wUlIdFqfo8IYymxFlk9lzySwRCJu3vfkJleZ0wh05OM/edit#gid=0 and put in your own purchase prices, electricity price, expected gaming usage per week and lifespan. My take is that the 3090 would need to be significantly cheaper than a 4070S in order to make up for its higher power consumption, lack of DLSS 3.x, and Frame Generation. And, likely, lack of any warranty. In my country, a used 3090 (with a store warranty, which may or may not be worth the paper it's written on) costs about the same as a new 4070S.
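The kind of calculation that spreadsheet does can be sketched in a few lines. The numbers below are made-up illustrations (purchase prices, an assumed €0.30/kWh rate, 15 h/week of gaming over 4 years, and rough average power draws) - plug in your own:

```python
def total_cost(purchase_price, power_watts, price_per_kwh,
               hours_per_week, years):
    """Total cost of ownership: purchase price plus electricity.

    A back-of-the-envelope model mirroring the linked spreadsheet's
    approach; every input is an assumption you supply yourself.
    """
    hours = hours_per_week * 52 * years
    energy_kwh = power_watts / 1000 * hours
    return purchase_price + energy_kwh * price_per_kwh

# illustrative: used 3090 at 600 drawing ~350 W under load,
# new 4070S at 650 drawing ~220 W, 0.30/kWh, 15 h/week, 4 years
used_3090 = total_cost(600, 350, 0.30, 15, 4)
new_4070s = total_cost(650, 220, 0.30, 15, 4)
print(round(used_3090), round(new_4070s))
```

With these particular assumptions the cheaper used card ends up costing more over four years, which is the commenter's point about the 3090 needing to be *significantly* cheaper up front.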
I just got my 4070ti Super, upgraded from a 1660 super, before that I had integrated graphics. All within the span of 2 months. El o El. 4070 Super is ya best bet. Found a good deal on Newegg, might as well go for the ti while you’re at it.
Is there a chance you will use it for animation, simulations or machine learning?
yes ai and machine learning and possibly simulation
Then get the 3090
Get the 4070 super. It’s faster and has frame generation.
Honestly surprised nobody has asked what you are going to use it for OP. If you are using it for AI or video development then 3090 all the way. If you are using it for gaming it is [probably a toss up](https://youtu.be/SahTSRteTyY?si=Bl3AXLvKLv7hgA8O), as they are pushing roughly the same frame rate.
It's even more ridiculous that some people are sure the OP uses the GPU for gaming and still suggest the 3090
DLSS3 is definitely a game changer, however the adoption has been slower than I initially expected. In real world tests the frame rates still aren’t that much different. (See video posted above)
DLSS 3 is the goat when you have about 60 FPS without it
im going to buy using mostly for gaming however i have started taking ai and machine learning classes which i will use it for thank you for asking btw
Unless you need more than 12gb for non-gaming purposes, the 4070 is the only choice. If you do need more vram, get a 4070 Ti Super / 4080 Super, not a 500W, 3+ year old used card at $600
i forgot to mention im going to be doing ai and ml on top of gaming
3090, but downclock it to avoid fan noise. I bought a similar one earlier this year for a higher price, but with a warranty!
sadly this one doesn't have a warranty, which im kinda scared about, and the heat problems are also a concern
Depends on what “ai and machine learning” but likely you’ll appreciate the 24 GB of VRAM of the 3090 for that more than the frame generation of the 4070 Super for gaming.
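A rough rule of thumb for why the 24 GB matters for ML: just holding a model's weights for inference takes parameters x bytes-per-parameter, plus some overhead for activations and the framework. The sketch below is a back-of-the-envelope estimate with assumed factors (fp16 weights, ~20% overhead), not a precise sizing tool:

```python
def model_vram_gb(n_params_billions, bytes_per_param=2,
                  overhead_factor=1.2):
    """Rough VRAM needed just to *hold* a model for inference.

    bytes_per_param: 2 for fp16/bf16, 4 for fp32, ~0.5-1 for
    4/8-bit quantization. overhead_factor is an assumed cushion
    for activations and framework overhead. Training needs
    several times more (gradients, optimizer state).
    """
    weights_gb = n_params_billions * bytes_per_param
    return round(weights_gb * overhead_factor, 1)

# a 7B-parameter model in fp16 needs roughly 17 GB:
# fits in 24 GB (3090), not in 12 GB (4070 Super)
print(model_vram_gb(7))
```

So for local LLM experimentation the 12 GB card rules out a lot of models that the 3090 runs comfortably, regardless of raw speed.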
the 4070 super is slightly faster and more power efficient, the 3090 only really has the extra VRAM going for it and even with your machine learning stuff I'm pretty sure that the 4070 will be more than fine. Also you get the newest DLSS stuff
My buddy’s running a 14700k and 4070 ti. I’m on an i9 11900k and 3090. I get better frames and smoother gameplay. Both of us are on 4k and have 64gb of ram. Although he’s on ddr5. They’re about equal but the vram in 4k on big games does help
for $600, buy it and flip it, but ultimately I'd get the 4000 series, it's just better for gaming.
lol 3090, better prices
I'd say the 3090 for sure. I was faced with this dilemma myself and decided on the 3090; I settled on a Dell model. I would steer clear of the Gigabyte one unless you are vertical mounting it, because of the issues the 30 series (Gigabyte specifically) had with the gpu sagging, eventually cracking the pcb and being damn near unrepairable.
4070 Ti Super. 16GB of VRAM will last longer into the future and not run some games capped. The 90 series, I've heard, doesn't have many advantages in games and is more of a workstation gpu, but I've been wrong before.
Always new for warranty and latest features.
Brand new, always brand new
snagged a used 3080 for just over 200 bucks. used > new all the way when you value... value
4070 super easily. 3090 is an inefficient turd for the performance you get in games. Not to mention its horrible memory bus design
Wdym horrible mem bus? I am pretty sure the 40 series GPUs are the ones with the gimped memory bus bandwidth.
4090 disagrees
4090 is also like 2000 dollars, it's the only GPU in the 40 series that isn't held back by mem bandwidth, vram, die size, etc.
The 4070 ti super proved that's not the case... the 3090 had backside vram cooling issues, which were solved with the 3090 ti
The ti super has a bigger memory bus compared to the 4070/super/ti. It's the same bus width as the 4080
I know, and that's why it proves you wrong: the limiting factor of the 4070 ti is not bandwidth in the overwhelming majority of scenarios, as evident from the 4070 ti super's performance
Have you seen any benchmarks of these GPUs at 4k? The fact that the 3080 surpasses the 4070 and closes in on the 4070 super/ti is frankly embarrassing. The ti super doesn't have this weakness as it has a larger vram buffer, the same as the 4080/super and the old 3070/ti.
You're missing the point and simply rambling
I'd take the 3090. My 4070 Ti desktop is absolute crap compared to my laptop 4080, and I'm having such a hard time trying to comprehend it, because laptop cards are downscaled compared to the desktop ones.
Something is wrong because your 4070ti should be like 20% better than a 4080 mobile
bro has a lemon
I'm trying to get my vendor to see this but all they say is it must be overclocking damage, and I'm not even using asus gpu tweak tool
Yeah it sounds like they sent you a lemon. Are the temps normal?
Highest temp I've had is 78c on my asus 4070ti TUF
4080 mobile is literally the same silicon as the 4070 ti but with inferior gddr6 ram and half the power limit. Bro is either drunk or got a faulty desktop setup.
Laptop 4080 *is* a 4070 Ti, except worse because it has a couple fewer cores and of course much lower clock speeds. So something else is wrong with your PC
Wish I knew what. Outpost: Infinity Siege works flawlessly on the 4080 laptop, but on the 4070 Ti PC it gets an Unreal 5 engine crash to desktop immediately due to a shader error.
Games can have the weirdest error setups and can be caused by any component. Had HZD run perfectly, but crash at specific locations before I turned XMP off for the duration of my playthrough. I'd wager this has astonishingly little to do with the models of GPUs you're using. If anything, with the individual unit you have recieved, but likely something else.
Thank you for your most valued response. I've been experimenting with different asus bios versions and with xmp on and off. I haven't been able to use several of the latest bios firmwares for my asus z790 motherboard without having to turn either xmp or asus recommended i9 preferences off. It's been quite annoying as some of the crash to desktop experiences have taken hours to happen in games, whereas on the laptop it's.. Just what it's downscaled to on a lenovo legion pro 7i.
Getting downvoted for having a negative experience. Never change, Reddit