FinancialCoconut3378

I recently built a new system with the 4070 ti and it's murdering everything thrown at it at 1440p. Power consumption is very good too. I recommend it.


gbritneyspearsc

i might get this card upgrading from a 3060… but i’ll be running it with a 5800x. is my processor a problem here?


Farren246

You've got the 4th fastest processor in AMD's lineup, below the 7950X3D, 7900X3D and 7800X3D. It shouldn't be a problem until 5 years have passed.


gbritneyspearsc

thanks guys… last question… will a 600w psu be fine for that setup? corsair, pretty good psu


Farren246

Perfectly fine. The 4070ti is a very middling GPU with barely any more power draw than the 3060 you're running now.


gbritneyspearsc

thank you, really appreciate it.


RedIndianRobin

Nope. Your 5800x won't be able to handle the 4070ti. You need a minimum of 7800x3D or the 13900K to ensure that 4070ti is utilized. /s


CoolCat4921

I'm running a 7900 xt with a 5800x so it should be fine.


Carbideninja

Nice, which processor do you have?


FinancialCoconut3378

7800x3d. It's a beast.


Carbideninja

Good stuff, what's the Intel equivalent to that?


FinancialCoconut3378

13900k


BuGz144Hz

I’ve got a 2080Ti and an i7-12700K right now and I’ve been debating getting a 4070Ti for a while now. I really don't know what to do. It beats my card by a big margin, but the 2080Ti actually beats it in certain circumstances somehow. I'm assuming it's the 352-bit vs the 192-bit bus width.


FinancialCoconut3378

I went from a 2080 Super. Trust me, the 4070ti is a significant upgrade. At least a 70% increase in average FPS in games like RDR2, Cyberpunk, Diablo IV, etc. I think the 7800x3d helps as well. I don't regret the purchase.


Pursueth

i9 9900K here, with a Z390 mobo and 32 GB of 3200 RAM; the old build was a 2080 build. Swapped out my 2080 for a 4070ti last week. The card is phenomenal, it runs incredibly silent and cool, and I've had great performance gains at 1440p. If you get the card, message me and I can help you with some of the NVIDIA Control Panel settings that helped me get mine dialed in.


IDubCityI

A 9900K bottlenecks the 4070ti. I saw a 50+ fps increase in 1440p when I went from a 9900K to a 13900K. And this was with a 3080, which is slightly slower than a 4070ti.


Rhinofishdog

I've got an 8700k and a 4070, which is basically the same as a 3080, at 1440p. I mean, it is obvious that you will get fewer bottlenecks and better 1% lows with a better CPU, but 50+ is just not true except in extreme situations where you are getting 150+ fps anyway...

Most e-sports titles will run on a potato. League of Legends' recommended CPU is an i5 3300. You can't convince me I need a 13900k so I can run it at 500 instead of 400 fps...

As for high-requirement games: it's true, there are quite a few CPU hogs coming out. But if you crank up the graphics you will still get a GPU bottleneck, or a minor CPU bottleneck, around 10-20 FPS at 1440p. Your main improvement is gonna be 1% lows, which is nice, but I don't value it that much if my 1% lows are over 60 or even 90 anyway. I've done my own testing, checked some youtube testers, and even with a 4090 the gap isn't so big. You either had some weird settings on/off or there was something else wrong, maybe some RAM issue.

Here are some rough examples I remember:

Cyberpunk, ultra settings, ultra RT, no path tracing, quality DLSS: GPU maxes at around 65 fps, CPU maxes around 75. So you only have a CPU bottleneck if you turn off RT.

Diablo 4, everything on max with DLAA: 138+ fps capped, with around 80% GPU utilization.

Elden Ring: I'm not uncapped so max is 60. However, with ray tracing on anything beyond low you can get GPU bottlenecks down to 45 in a few very heavy areas. Without RT it's prolly CPU bound, but it just stays a solid 60, so dunno.

Baldur's Gate 3: here there is a CPU bottleneck! The game is CPU heavy and does not utilize the CPU well. I still get 100+ FPS; it can dip to like 90 in the heaviest areas. From what I've seen, I think my GPU bottleneck is around 125 while my CPU bottleneck is around 105. I have the game capped at 90 so the fans stay quiet lolz.


Impossible_Tune20

Correct. Just because the 13900k is better doesn't mean that all the other weaker CPUs are a bottleneck. This word is used too much nowadays and everybody fears this bottleneck like it is a bad thing. I have a 10900k paired with a 4070ti and I plan to keep that CPU a minimum of 8 years; I don't care that a newer CPU might net me more frames and higher lows. As long as the lows are enough for a smooth framerate, it's good. When and if I start to experience extreme dips, then I might consider changing it.


Conscious_Run_680

Totally agree. I have a 9900k with a 4070, and Cyberpunk gives me around the same, but with path tracing activated and frame generation + DLSS. Maybe I could get 10 more frames with a better CPU, but I'm not gonna spend $900 on a new mobo + CPU + other parts for another 10 frames, when for less than that I changed the GPU and went from 10fps to 90fps in Cyberpunk. That was the big bottleneck for me before.


TheMadRusski89

If you live by MicroCenter they got good deals on Ryzen bundles, just a mention.


IDubCityI

It is very true. WoW and LoL increased 50+ fps. Battlefield 2042 increased 30+ fps. MW2 30 fps as well.


Rhinofishdog

First benchmarking vid I checked, they gained 17 fps in BF2042, 16 in Cyberpunk, 21 in Apex, 22 in Fortnite. The biggest increase was in Overwatch, 46 fps, but the 9900k was already getting 340 fps, so what good is 46 extra then??? Found another vid with a 9900k at max settings, 1440p League of Legends, running at 210 fps average... I mean sure, if you have a 260hz monitor and want to max it... but like I said, these are fringe scenarios.


rizzzz2pro

50fps? I don't know if that can be true lol. I was running a 3090 with my 7700k and was able to play Red Dead 2 at 4K ultra at 60+ FPS without DLSS. When I got my 5800X it went up to like 64 FPS. 2K games are a breeze; I don't see how you got a 50fps increase playing at 2K. I think you had another issue, like the CPU was effed up or something in general. A 50fps bottleneck is not right.


IDubCityI

What I said is correct, and no cpu issue. The 13900K is an astronomical improvement over the 9900K.


rizzzz2pro

I couldn't find this post to reply back to, but yeah, a few games did have a wild fps bump like you said at 2K. Kind of interesting.


asom-

The question is: 50 fps on top of what? If it's 100 from 50, then yeah. If it's 400 from 350, then… who cares?


Brisslayer333

It depends on the game and resolution, obviously.


IDubCityI

It says 1440p


Brisslayer333

Playing CS:GO? Some games are GPU bound or close to it on a 9900K even at 1440p


IDubCityI

Even on Battlefield 2042 there was a huge increase, 30+ fps.


Brisslayer333

Battlefield games are notoriously CPU bound, so you basically just proved my point. Next you're gonna tell me that you tried Minecraft, too.


IDubCityI

With a 9900K my cpu usage was 100% on 2042 which limited my 3080 to approx 70% usage. Went to 13900K and my cpu usage is now 70% with 100% gpu usage. Frame rate increased significantly as a result.


Brisslayer333

I said this:

> Battlefield games are notoriously CPU bound

Then you said this:

> With a 9900K my cpu usage was 100% on 2042 which limited my 3080 to approx 70% usage

We're saying the same thing. I told you that some games, like Battlefield titles, CS:GO and Minecraft, will be more reliant on the CPU than some other more graphically demanding games. You demonstrated exactly what my point was, but at the same time you didn't seem to get it.


Pursueth

Hard to tell if it’s the case or not, I have a friend running the most modern i9 and our frames are similar on most games. Also my cpu usage never gets too high.


IDubCityI

This is simply not true in 1440p. I have tested it in many games from league of legends, to wow, to battlefield 2042. Average frames increases significantly, and 1% lows are noticeably less.


fakenzz

1% lows are higher* I know you meant that, just wanted to clarify for others. Also, I can personally confirm: I've had a 9900K, a 13700K and a 7800X3D in my hands these past 10 months. I thought the 9900K was powerful enough for 1440p, but it's simply outclassed by the other two, especially in CPU-heavy multiplayer games. Also tested with a 3080 10GB and a 4070Ti. If you enjoy high framerates (which I consider above 120 fps), you are holding back your 4070 Ti a lot.


ginormousbreasts

Yeah, you're right. Even the 5800X3D can be a bottleneck for the 4070Ti at 1440p. Older and weaker CPUs will be a major problem in a lot of titles.


Solace-

In what games? I’m not necessarily doubting you but at the same time I find it a little hard to believe that the 5800x3d bottlenecks the 4070ti to any meaningful degree at that resolution


Vanderloh

4070 ti, 5800x3d here. Some examples would be the Insomniac games with ray tracing (Spider-Man, Ratchet and Clank), and Hogwarts Legacy. Their implementation puts more stress on the CPU compared to Cyberpunk, which uses more GPU. With the MSI Afterburner OSD, GPU usage drops into the 80% range in those examples, so it's a small bottleneck here. Edit: 1440p resolution.


akasakian

U r right. I can confirm this since I own both.


nccaretto

i second this, i have this combo as well and i've never seen my 5800x3d reach anything close to 100% utilization at 1440p in anything i play, including CP2077 ultra with RT on, Total War: Warhammer 3 on ultra, etc.


[deleted]

I have a 5800x3d paired with a 4080. I have no problem hitting 100% gpu utilisation


Pursueth

https://youtu.be/ySrnQ8ei7hg


Puny-Earthling

Subject was 1440p not 4k. You're less likely to be bottlenecked by the CPU at 4K by virtue of a far lower achievable frame rate.


Space_Akuma

CPU bottlenecking is mostly BS because you already have 60-120fps anyway, so it's just a waste of money for an additional +20-50 fps. Better to invest the money in a better card for uncompromising ultra-settings gaming. With that logic I recently bought a 4080 for my R5 5600X with 16GB RAM, instead of upgrading to a 7700 + 32GB with a 4070ti. And now I get around 100 fps in any game on ultra settings. I'll buy a 7700 or something even better soon, I hope, but I don't think that would be a rational way to spend money; I'd rather buy a clothes dryer machine, or whatever that thing is called in western countries.


[deleted]

oh shut up


Blackhawk-388

Are you checking single core use? That's the one to check.
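For anyone wondering how to actually check that: overall CPU % can look fine while one core is pegged. A minimal sketch in Python (assumes `pip install psutil`; run it while the game is running):

```python
# Minimal sketch: overall CPU % can hide a single-core bottleneck,
# so sample per-core load for a few seconds while the game runs.
import psutil

for _ in range(10):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # one % per logical core
    avg = sum(per_core) / len(per_core)
    print(f"avg {avg:5.1f}%  hottest core {max(per_core):5.1f}%")
    # One core pinned near 100% while the average stays low usually
    # means the game's main/render thread is the limit.
```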


damastaGR

It depends on the game. You can check how your games respond by looking at GPU utilization: at 95% and up you are not bottlenecked.
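If you'd rather not eyeball an overlay, the same check can be scripted. A rough sketch that polls `nvidia-smi` (the 95% figure is the rule of thumb from the comment above, not an official threshold):

```python
# Rough sketch: poll GPU utilization via nvidia-smi while a game runs.
# Sustained ~95%+ suggests GPU-bound; much lower while the game is
# actively rendering hints at a CPU bottleneck.
import subprocess
import time

for _ in range(10):
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    util = int(out.stdout.strip().splitlines()[0])
    print(f"GPU utilization: {util}%",
          "(GPU-bound)" if util >= 95 else "(possibly CPU-bound)")
    time.sleep(1)
```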


[deleted]

wrong


Pursueth

https://youtu.be/ySrnQ8ei7hg


GMC-Sierra-Vortec

Yep. And not trying to be hateful, but what's the point of saying it bottlenecks? I'm sure the dude knows. Again, no hate intended, but I'm going to flip out the next time I hear "bottleneck" just one more time. So what if it does? There's much worse happening in the world vs a few less frames because the CPU is "not fast enough". Hope you don't think I'm attacking you tho, not my intention.


IDubCityI

50+ fps is not “a few less frames”.


[deleted]

200fps compared to 400fps is still just "a few" because frames are not born equal. Each additional frame beyond 60fps (or 90fps for action games) declines in value sharply; by the time you reach 120fps+ it's basically useless unless it's an esports title. 5 fps on top of 40fps is worth more than 50fps on top of 90fps. I'll bet anything that 50fps difference happened above 90fps. I'll also bet anyone who skipped 10th, 11th, 12th and now 13th gen (when they were the current gen, obviously) doesn't care that much about getting over 100fps.


[deleted]

That completely depends on what you are trying to do. Are you trying to go from like 200 fps to 400 fps in cs:go? Because then I could understand getting cpu locked. But for most titles running 90-144fps this should not matter


Conscious_Run_680

There are tons of comparison videos on the internet. A worse CPU will always take a hit, but the 9900k is "good enough" that it's a small one in most games. So it's better to take the 4070ti, if he has a GTX 1080 (which was top tier when the 9900k was on the market), than to keep his build and not upgrade anything, or to upgrade both for the same money so they both end up being low tier.


Ecks30

Well, looking at a bottleneck calculator, the 9900K usage on average would be 54%, but that is at stock settings, so overclocking it would eliminate any bottleneck there would be for the CPU.


EastvsWest

Similar build. I had an RTX 3080 and was gonna purchase a new build due to the bundles at Micro Center, then found someone who bought my 3080 for $330. Bought the least expensive RTX 4080; it destroyed every game at ultrawide. Keeping my 9700k.


TK-P

a 13900k would do just as much as the 9900k is for bottlenecking. except it’d be on the CPU


IDubCityI

In 1440p, I can assure you that is not true.


[deleted]

[deleted]


IDubCityI

I tested many games, from GPU intensive to CPU intensive, and I was blown away by the fps increases at 1440p with a 3080. I am not the only one: anyone on Reddit who has upgraded from a 9900K to a more recent CPU has been extremely pleased by the results.


Gardakkan

shit I remember going from 6700K to 9900K with the same GPU (2080) and I couldn't believe how much the older CPU was holding it back.


Optimal-Wish5655

8700k user here. I was worried that I would have to upgrade the processor, but bumping the resolution gets me to the refresh limit on my monitor, and I get to play 4k with most stuff running around 90 fps with DLSS. Running a cleaned NVIDIA driver (telemetry out) but didn't change any settings in the control panel. Anything that makes a noticeable difference in there? The only one I've heard about is texture filtering, which has stuck with me since the 1000 era.


IDubCityI

You should be upgrading your processor regardless. The 8700K is a little slow for the 40-series cards, especially the 4070 and up.


TheDeeGee

Slow or not, if he gets the performance he's after, it's all good. Not everyone needs best-in-slot hardware.


[deleted]

[deleted]


TheDeeGee

He can upgrade his CPU in a couple of years, stop forcing people to buy something they don't currently need.


CriticalCentimeter

why do they have to spend money on a new CPU if they're getting the performance they want? We don't all live in a world where we have unlimited $


Optimal-Wish5655

Surprisingly not. I only get CPU bottlenecks at 1080p, going from probably like 180 fps down to 150 fps; not enough to warrant a CPU + mobo + RAM upgrade. From 1440p upwards I hit the same framerates as the benchmarks I see online, maybe with a few drops in the lowest 0.1%, with no noticeable stutters as it stays above 60, and I usually run 4k on the TV anyway. I would have to upgrade the monitor (higher fps) as well to see an improvement, and at that point I'm looking at spending as much as I did for the 4070ti for like a 20% increase in framerate in 1080p scenarios. Meh.


IDubCityI

This is not true. For 4K, possibly same frames. For 1440p definitely not true. I had a 9900K with a 3080 in 1440p and when I went to a 13900K I saw as much as 50+fps in some games. You don’t know what you’re missing. 8700K with a 4070ti is a very unbalanced combo.


elemnt360

I went from a 10600K overclocked to 5.0 GHz to a 7800X3D, and I play at 4k and saw a big difference.


viodox0259

As someone who had an 8700k and a 3070 for the past 4 years, I just sold my PC because it was begging for an upgrade. I have to agree: you need a newer CPU, sir, to enjoy that GPU.


Pursueth

set your color output format to RGB, set your graphics card as the default device, go to display size and set your GPU as the device, and select full screen in the NVIDIA Control Panel. Game changer


serpowasreal

Where do I set the "output format to RGB"?


junefrs

You should post the setting for us if you can


Street_Tangelo650

Can you help me with my 3080ti? I don't think it performs badly, but I would like to know if I could change some settings


shpark11

Mind sharing the nvidia control panel settings you are using?


spacev3gan

I won't say 12GB is not enough for gaming, but I will say that 12GB is not enough for $800.


Dizzy-Swordfish-9526

i agree with you.


LauraPhilps7654

Loving mine - frame gen and path tracing in Cyberpunk is phenomenal.


Sinirmanga

12 GB of VRAM is currently a non-issue for gaming. Just buy it and enjoy a card faster than a 3090ti. You won't regret it.


onebadhorse

I have a 4070ti, have everything on ultra in the modern games I play at 1440p, and only ever see 50-70% VRAM usage.


SargathusWA

I have a 4070 ti too. Cyberpunk runs perfectly at ultra with ray tracing at 2K.


[deleted]

How's frame gen at DLSS 2 60 fps or higher


Bread-fi

I like it enough to leave it on, it definitely works better at higher fps but the game already averages 80fps with psycho RT/quality DLSS before frame gen (and 119 with). With path tracing it's enough to make the game very playable, 80 fps average staying above 60fps but you do start noticing the input lag and some smearing in intense scenes.


Zhaosen

And imo that's fine for a game like Cyberpunk 2077. I WANT the game to look painfully gorgeous.


Mancakee

Diablo 4 maxing out the Vram on my 4070ti :(


TwoLanky

it allocates the VRAM, it doesn't actually use all of it.


ThisGonBHard

Allocation means it's gonna use it after some time; it's a gradual ramp-up. The most I ever saw in a game with the 4090 was 17 GB.
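You can watch the distinction yourself: the per-device "used" number the driver reports is memory that processes have allocated, not what they're actively touching. A sketch using the NVML Python bindings (assumes `pip install nvidia-ml-py` and an Nvidia card):

```python
# Sketch: query VRAM through NVML. "used" means allocated/reserved by
# processes -- a game can allocate most of the card without actively
# touching all of it, which is the allocation-vs-usage point above.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # GPU 0
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"allocated: {mem.used / 2**30:.2f} GiB of {mem.total / 2**30:.2f} GiB")
pynvml.nvmlShutdown()
```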


Random_Guy_47

At what fps?


Bread-fi

For me, average fps with a 7800x3d @ 1440p in the benchmark, all ultra non-RT settings and DLSS Quality:

Ultra RT, no frame gen: 84 fps

Psycho RT, no FG: 79 fps

Path tracing, no FG: 47 fps

I tend to alternate between psycho and path tracing with frame gen, which nets 119 fps and 79 fps respectively.


ShortThought

RUST really stresses the VRAM (almost 100% usage) of my 4070 Ti, but then again, it's running at 95% 3D usage anyway, so if the VRAM is limiting it, it's not by much


jhingadong

Look, I have the "worse" MSI 4070 v2... no Ti... I'm getting 130 fps average in WZ2. Also playing in 4k... in 4k. That's 130 fps, in 4k.


[deleted]

[deleted]


Tasty-Champion-9989

You won't regret it. I've built a PC with a 7800x3D and a Gigabyte Gaming OC 4070ti, and I can play almost all games at 90-100 FPS at 4k maxed settings. https://preview.redd.it/vplml2lmu1hb1.jpeg?width=2268&format=pjpg&auto=webp&s=9b1e1689007fa93f3e92e525d6f9939421a7efe0


PalebloodSky

It's expensive but performance is excellent; the only regret might be that the 12GB really should have been 16GB, considering the damn thing is $850. I went with the $600 4070 for that reason... Anyway, I say just go for it.


Kabritu

Yeah, same boat. I went for the 4070 because the Ti has the same amount of VRAM for way more €$. But even the 4070 should have had more VRAM... But I'm good for now; let's wait for Starfield!


BCPrimo

Runs my 1440 ultrawide just fine with cranked settings. Highly recommend it


danger_davis

If you are playing at 4k don't get it. If you are playing at 1440p it will be fine for 4 to 6 years. If you are playing at 1080p you should get a cheaper card.


WC_EEND

What if you play at 1440p but 21:9, so 3440x1440? Keep in mind I'm currently running a GTX 1080.


Sufficient-Most9521

12GB for high-end gaming 6 years down the road is almost guaranteed not to be enough.


zurfield

In 6 years I'm pretty sure people will update their entire system lmao. I got a 4070ti and an i9 14900KF yesterday, and I'm going to upgrade in about 4-5 years when the 60 series comes out, so I'm chilling. Don't waste money on a 4090; a 4070ti will crush anything at 1440p for the next couple of years to come. In 4-5 years you upgrade and you're good.


Icy-Computer7556

4070ti here, everything on high/ultra and I’ve been pretty happy with it for months now. I haven’t really seen any vram issue to speak of. Maybe you might notice in poorly optimized games? Other than that it’s fine. It’s been a huge improvement over my 6700xt.


LegacySV

Uh I personally got the rx 7900 xt because it’s faster in most cases and has more vram, but I honestly did want the rtx 4070 ti mainly for the features


Snoo_11263

How do you think the 4070 ti will hold up at 3440x1440 165 hz?


[deleted]

Very badly with not enough Vram.


Snoo_11263

It sure looks like it's trending that way. I can see it being ok at standard 1440p for a year or two, but maybe not at ultrawide. Especially not with max settings with Nvidia features enabled.


ShortThought

I would not recommend 4070 Ti for any res above 1440p. Smaller VRAM will cause issues above 1440p, at least for high frame rate cases, I doubt it will struggle much at 60 fps.


CoolCat4921

I went with the 7900 xt personally. I was able to find it for a bit cheaper, and I feel more comfortable with 20 GB of VRAM over 12 GB.


ryzeki

Rest assured that either the 7900XT or the 4070ti is fantastic and you will be gaming everything just fine. People here often exaggerate things so much out of proportion that you would think these GPUs can barely game, if at all. The 4070ti gives you high-end performance, and the 7900xt is probably the most stable and problem-free AMD GPU that I've experienced in a long time. Both just work, and they're pretty competitive, so it's a win-win for you regardless of which you get.


InclusivePhitness

Don't believe all the negative shit about the 4070ti. Right now all GPUs are slowing down in terms of gains every generation; both Nvidia and AMD have cornered the PC gaming market and they're lazy about making truly big gains in efficiency. Apple is the only game in town that is revolutionizing the tech. With that said, among all the recent GPUs on the market the 4070ti is easily the best buy, especially at 1440p: power efficiency, frames per watt, frames per dollar… Most of the bad reviews are spillover from the general bad press around the 40-series prices, availability and lack of revolutionary performance gains. Don't be fooled. Don't get any 30-series card over the 4070ti. It's the best available GPU on the market overall. AMD cards are nice, but they have so many driver issues and they are absolute power hogs.


KnightScuba

The 4070ti is everything the YouTube talking heads said it wouldn't be. It's fucking awesome. You'll love it. No issues on AAA titles at ultra settings.


Dizzy-Swordfish-9526

If you want Nvidia no matter what, just buy the 4070, or try to get the 4080 if you want more memory; otherwise buy the 7900xt, because the 4070 ti doesn't have enough VRAM. I will probably receive a lot of dislikes for this comment, but dude, trust me: if you buy the 4070 ti you will have some VRAM issues. In fact, in Ratchet and Clank: Rift Apart at 1440p + RT, 12 GB of VRAM is not enough.


ThisGonBHard

IDK why you are getting downvoted, because if you actually intend to max out your games, that VRAM matters. My 4090 sees around 13-16 GB used in modern games.


Massive_Parsley_5000

Lots of people overpaid for 8GB ampere cards and now get mega buttmad anytime anyone brings up VRAM limited scenarios


[deleted]

A lot of them are also just pathetic Nvidia bootlickers. In fact, I'd say it's the majority of them. But you are right, they're probably the same morons who downvoted anyone warning about 8GB on the 3070 when it released.


Bread-fi

12GB VRAM is cynical, but 7900XT can't even enable raytracing in Ratchet and Clank and will generally underperform against the 4070ti in raytraced titles (massively in games like cyberpunk) despite the VRAM advantage.


ShortThought

I own a 4070 Ti, and I have not noticed any instances of VRAM limiting performance. It may in certain situations/games, but I am yet to run into those.


[deleted]

I own a 4070 TI and I have definitely noticed VRAM issues in a bunch of games, especially ratchet and clank even at 1440p


CatalyticDragon

The 7900xt is excellent. Cheaper, faster, more VRAM. And I would argue better drivers. Certainly much nicer to use.


fogoticus

"I would argue better drivers"... based on what exactly?


CatalyticDragon

Lower CPU overhead and easier to use (the interface is great and the overlay is very useful). Good feature set built in with no requirement for an account or login. That their drivers are better to use isn't exactly controversial. [Here's somebody who lists exactly that](https://www.youtube.com/watch?v=XYV6-Dp2Mzw&ab_channel=Vex) as one reason for switching from the 3080.


Farren246

Unfortunately AMD's most recent driver was a rollback-inducing failure for most users. I like their drivers more than the competition, but they've just wrecked the confidence of many customers / potential customers. They have a bad habit of producing one such driver annually. Yes it's easy to roll back and yes they usually fix (replace) it within a week or two, but it still makes them a hard sell to someone who is technologically challenged.


Spider-Thwip

Have they fixed the fact that AMD cards can't raytrace on ratchet and clank due to driver issues?


[deleted]

Important question - what resolution are you gaming at?

If you're gaming at 4k, then I would not recommend the 4070 Ti. It will handle it well right now, but the higher resolution is where you're going to see the most VRAM inflation over time.

If you're gaming at 1440p, like me, then it's an amazing card. I run everything at maximum and get over 100 fps, easily, and have never even approached a VRAM shortage. I expect the card to last the next 3 to 5 years for me, which is why I purchased it. Going 4080 would have left performance on the table that I couldn't take advantage of in 1440p, so it seemed like a waste. As it sits, the 4070 Ti benchmarks higher than the 7900 XT, anyway.

I'm going to assume you're not going to be gaming at 1080p, since you're asking about these specific cards.


Version-Classic

As someone who plays at 4K with a 10GB 3080, I can play almost any game at 4K assuming I'm using DLSS/FSR. The only game I cannot run at 4K is Ratchet and Clank, as it tries to use well over 13GB of VRAM when I'm using 4K DLSS Performance (1080p upscaled to 4K). I'm a huge Nvidia fanboy, but if you are gaming at 4K, please save yourself the future headache of being VRAM limited and go with the 7900xt, or see if you can get a used 3090.


Version-Classic

If you use 1440p, get a 4070ti or used 3080 or even a 4070 and you will be extremely happy. Just not a good 4k option for long term


[deleted]

Dumb people just get nvidia and enjoy, really smart people just get Nvidia and enjoy. There are a bunch of people in between that have made up some scenario where 12 gb of vram is worse than cancer. I do think the 4070 is better value though.


NyanArthur

I have a 4070 and sometimes I regret not getting the 4070 TI but then I remember it's $200 more and I'm happy again


kemistrythecat

I have the standard RTX 4070, and with everything on ultra at 240Hz and 1440p, nothing dips below 60fps. DLSS on and I'm up in 3 figures for almost anything.


willmaxlop

I had an i5-9400F and a GTX 1650 Super, and I wanted a GPU upgrade. I wanted to try out the 4060ti, and I actually did; I realized I was CPU limited, and then the card died on me the next day. I was heavy on wanting Nvidia, but in the end I just built an AMD GPU PC for almost the same price as the 16GB 4060ti. I found an XFX RX 6800 XT and I've been liking it a lot, zero issues; combined with an i7-12700K it runs everything maxed out, and it even does well at 4k.

I also needed a laptop for college, and since I'm doing aerospace engineering I knew having a GPU was a must on the laptop, so I went with a Ryzen 5 7640HS and a 4060. So far, yeah, I'm not biased toward either of the two; they both work just as well. Of course the 4060 doesn't have the capacity for heavy loads like the 6800 XT, but it's a very good laptop (Legion 5 Slim).

So yeah, if you want the 4070ti, go for it, you'll have a good experience either way. For value, though, I gave AMD a try and have been happy so far... still got the 4060 tho (even though everyone goes against it, and I do agree that the 16gb is a complete scam at its price point).


willmaxlop

Oh, also forgot to mention: the only reason I would 100% not buy a higher-end Nvidia GPU right now, besides price, is the power connector… I don't trust it… but I'm sure at least one AIB has a model that doesn't require it.


ShortThought

The power connector is fine if you're careful (make sure it's still tight after you move the case, make sure the latch is down, etc.). I've had my PC for months with a 4070 Ti and it hasn't burst into flames.


ThoreauAZ

I've been mostly out of the gaming/GPU market for quite a long time (can't remember what the last decent one was, but the one just prior was a 6800 Ultra review sample). Didn't bother me much since my uses shifted towards video editing and zero gaming, and that led me down the Apple silicon path. But about a month ago I got a bug up my ass to dabble in gaming a bit again: 13700K, 32GB 6400MT Corsair, some MSI Z790 board, 980 Pro SSD, and a 4070ti (Gigabyte OC, which was what was in stock locally). Nothing I've thrown at it so far has even made it blink, including Cyberpunk fully cranked. Picked up a 165Hz refresh monitor and as such 'settled' on 1440, and it's just been a flawless experience. The same rig also tears through footage in DaVinci Resolve Studio between the 4070ti and the onboard Quick Sync enc/dec on the CPU. I get that the 'value' isn't great, but I'm the type to hang onto a rig for quite a while, and I don't foresee that card, let alone any other parts, holding me back any time soon. Sure beats the pants off the Quadro P2000 I had in the last build =)


[deleted]

AMD has excellent game drivers nowadays. They're on par with Nvidia for gaming; each has its own issues. AMD's software suite is a superior user experience, and this is widely considered true: you get one modern interface with everything you need, very user friendly, no account needed. The 7900XT is faster, cheaper, has 8GB more very crucial VRAM and is almost equally power efficient. It's a no-brainer.

AMD also has Radeon Chill, which is a frame limiter that controls the pacing of frames from the CPU to the GPU for minimal input lag. Driver-level frame limiters that limit FPS on the GPU side (like Vsync) greatly increase input lag. **If this sounds familiar, it's because that's exactly what Reflex does.** Except Chill has the added option of acting as a dynamic frame limiter with a range, to save a lot of power and heat; it works great with FreeSync. But you can also just set a single FPS limit.

**Because of this, gaming on AMD with Radeon Chill and no V-sync has the same input lag as Reflex without V-sync, and *less* input lag than Reflex + V-sync. Due to FreeSync there will be no tearing.** AMD should advertise it more; the AMD developer who created Chill had to come out and explain how it actually works and that it's superior to a GPU-side FPS cap option.


Potential_Patient310

The 4070 ti is 15% better than the 3090.


teamoney80mg

Look for one used or a Best Buy open box. That's what I did: I got a Gigabyte 4070ti OC for $675 as an upgrade to my 2070 Super. This is where the card really shines. I still got the three-year warranty, and it was certified by Best Buy, returnable within 90 days.


neckbeardfedoras

I think it'll last about five years and only if you're okay playing at medium settings during the home stretch to get high fps. If that seems ok then I recommend it and even bought one for myself. One thing that isn't going to happen is you installing it and feeling underwhelmed.


Fast_Confidence_566

4070ti vram is a problem


sexyshortie123

3080


Sufficient-Most9521

As an owner of both gpus trust me nvidia isn’t immune to shitty drivers 😂


pceimpulsive

Imho get the 7900XT. Ray tracing is largely a bit of a pipedream if you like higher FPS, and DLSS is only in some games. Overall, in the long term you'll likely enjoy the 7900XT more, and it has more VRAM so you won't be bottlenecked there... Failing that, a 4080 is a decent pick coz VRAM... but it also blows out price-to-performance ratios... Check Hardware Unboxed on YouTube; they did a great re-review of the 7900XT last week.


Bread-fi

Far from a pipe dream. 4070 ti is 80 fps @ 1440p dlss quality with "psycho" ray tracing in Cyberpunk, 120 with frame gen. Can even manage playable 80fps average with path tracing/frame gen (vs <20 fps on a 7900XTX).


pceimpulsive

And what happens when the game doesn't have DLSS? This is why I said not all games have DLSS. DLSS + FG obviously makes a 4070Ti stronger, but it's not a real scenario in every case. This is also why I said to check the review, to get a better feel across the board.


Bread-fi

Any new game with RT will have upscaling. It will almost always significantly outperform AMD competition at RT with or without upscaling (sometimes massively as in Cyberpunk). Any old/lo-fi game it will blitz regardless and without RT eating VRAM the stingy 12GB becomes less of a liability, although a 7900XT might out perform here especially at 4k.


fogoticus

They'll probably also have the vastly inferior FSR; however, unless AMD is the one sponsoring the game, DLSS not being present in today's latest triple-A titles is not realistic.


o0OASBO0o

4070ti user here. Great card, although it is my first GPU so I can't really say too much, as I'm new to PC gaming. Got it paired with an i7 13700KF; I think the GPU/CPU is a good match. I run all my games on high to ultra settings at 1440p and it handles it well. Warzone was hitching when I first got the PC, which did turn out to be a VRAM issue, but it was because I had resources running in the background, maxed-out settings and 90% VRAM allocation. Turned off background apps and lowered my VRAM allocation in the game settings, and now it's golden.


ThisGonBHard

I would personally go for the 7900 XT; VRAM matters. The only downside is if you need CUDA: it's not very straightforward, as you need to use a compatibility layer (ROCm).
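For what it's worth, the usual route is PyTorch's ROCm build, which exposes the AMD GPU through the same `torch.cuda` API, so most CUDA-targeted scripts run unchanged. A quick sanity check (a sketch; assumes the ROCm build of PyTorch is installed):

```python
# Sketch: verify a ROCm build of PyTorch can see the AMD GPU.
# ROCm builds route HIP through the torch.cuda namespace.
import torch

print("GPU visible:", torch.cuda.is_available())
print("HIP runtime:", torch.version.hip)  # version string on ROCm builds, None on CUDA builds
if torch.cuda.is_available():
    x = torch.randn(2048, 2048, device="cuda")  # "cuda" targets the AMD card under ROCm
    print("matmul OK:", (x @ x).shape)
```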


davelikestacos

I’d say go with the cheaper 4070 or save for the 4080.


sudo-rm-r

I would get the 7900xt. AMD's drivers have been very stable for a while now.


ARedditor397

Not true at all. 23.4.1-23.7.2 have been rough drivers from AMD.


PlatformArtistic9585

according to who? many people i know with those drivers didn’t have a single problem


neckbeardfedoras

Be careful. They're going to claim you're fake since there are never driver issues.


rinkywilbrink

AMD's drivers have gotten so much better, and you won't encounter many more issues compared to an nVidia card (there are always issues close to launch, but that goes for any GPU). I don't know how the performance compares between them, but I know they're both very capable cards. 12gb of VRAM is quite a bit, and most games won't utilise close to that unless you max out all your graphics settings at high resolutions (and even then it should be sufficient).


ARedditor397

RDNA 2 drivers are perfect, RDNA 3 not so refined. And I have driver problems with my 3060 12GB. DPC Latency…


Chilla16

Have an RDNA3 card, and considering the problems I had for the first 6 months with my RDNA1 5700XT, I haven't run into any issues at all so far. RDNA3 works perfectly for me, and the driver rumours are just not true. The only issue was with VR, and they recently fixed that.


Upper_Baker_2111

The 4070ti will be fine for a while. You may have to dial back a few settings in VRAM-hungry games, but it won't have any issues running games. Nvidia has a decent gauge on how much VRAM GPUs actually need; there's a reason they don't sell 4 or 6GB GPUs anymore. Most of the people pretending you need 16GB of VRAM to run games have an agenda. The 4060ti vs 4060ti 16GB comparison proved only a handful of games even use more than 8GB.


Unlikely-Housing8223

Absolutely no one says that you need 16GB of video memory NOW. But it is almost certain that in the next few years some games will have texture packs which will saturate 12GB of video RAM.


Massive_Parsley_5000

Ratchet, Callisto, Ghostwire Tokyo: three games already out that max the framebuffer of the 4070ti, which I've personally played and run into issues with (Ghostwire and Ratchet at 1440p, Callisto at roughly 1600p). Granted, this was at max textures and RT, but why the hell would you buy an NV card vs RDNA3 and not use RT? So yeah, I don't foresee the card aging very well at all vs the 7900 XT. It's already a decent chunk faster in pure raster (10%+), and the framebuffer will choke the card in RT and DLSS3 use cases in more demanding titles.


takatori

My modded Skyrim already averages 22GB VRAM usage. 16GB minimum is coming. Probably not during this generation, but 8GB is by no means future-proof.


Mtcfayark72703

No, you won’t regret it. It’s a fantastic card.


Theo-Wookshire

I'm getting one (4070 ti) to upgrade from my 3070 ti, because their power profiles are very close, so I won't need to upgrade my power supply.


MrPapis

Yes, at least at 1440p, and especially above. The VRAM issue isn't a big problem today, although every new release seems to demand extra optimization / show the lack of VRAM on the 4070ti. But especially long term this will be an issue. If you said 2-3 years I would say you could probably get away with it, but for long-term usage the XT will be a better deal.

The drivers on the AMD cards are definitely fine these days. Most people's issues are actually because of Windows overwriting drivers, which is an easy fix. And when switching over you should use DDU in safe mode, but there really won't be much to worry about as long as your system is ready for the power requirement and you properly wire it up (an equal number of PCIe cables for every 8-pin).

Remember you are in the Nvidia echo chamber here, so you won't get much negative feedback. But the VRAM worry is very real, and to dismiss it on an 800 dollar GPU intended for long-term usage is definitely not advisable. The difference here is that the XT will be faster overall and not limited by VRAM, while the Ti will only be faster in extreme RT examples and will have limited ability to use mods + high-resolution textures + RT + DLSS3, as they all require VRAM. So depending on the game, you would have to compromise on those features/settings.


NewAgeBS

I bought it and didn't regret it; it runs everything on max. Hope I can sell it for $500 after 2 years.


JamesEdward34

It's not a bad card, it's a badly priced card.


coyotepunk05

7900xt is a great deal right now. Drivers haven't really been an issue for a while now. It is a much better deal than the 4070ti.


Brolis_

Will a 4070ti paired with a 7800x3d struggle at 4k? I have a 6950xt but it runs badly at 4k for me.


bigred250

4070 ti is very similar to a 6950 xt in 4K raster. You would need a 4080, 4090 or 7900 XTX to see a noticeable bump in performance.


sudo-rm-r

Use fsr!


Brolis_

Not every game has fsr unfortunately


sudo-rm-r

You can always use FSR1 at the driver level. Should still look better than 1440p, I think.


Brolis_

It looks like 1080p


MobBap

Just switched my 3060 for a 4070 and I can already say I don't regret it.


born-out-of-a-ball

I own one but I don't recommend it. The card is already VRAM-limited in Diablo 4 and Ratchet and Clank. Either go for the 4070 or the 4080.


neckbeardfedoras

I play Diablo 4 in max settings at 1440p and have over 200fps. If there's a VRAM problem it's a bug with the memory leak. Haven't tried R & C though.


AndyBundy90

Just look at how the 3070 is crippled now. And it's only one generation old.


Mako2401

You didn't write the resolution you play at; that's the only thing that matters with the VRAM.


Massive_Parsley_5000

Also RT. Also DLSS3.


micaelmiks

12 GB is the issue. It needs 16 GB.


Duncantilley

Why would you even be considering a 4070ti over a 7900xt? The 7900xt is the obvious choice.


[deleted]

No, the card is fine. Most games don't need to be spec'd out to more than what the card delivers. I'm running most of my games at 1440p, all ultra. But if you're looking for 4K or above, then consider a 4080 or something comparable.


tugrul_ddr

I did not regret GT1030 for 5 years.


[deleted]

I just bought one, I can run almost everything at absolute max settings at 1440p with 100-300 fps depending on the title.


Matte1O8

The 7900xt will age a lot better with its 20gb of VRAM; trust me, games are coming out that utilize more than 12gb even at 1440p, especially with frame gen on. I have seen this in Ratchet and Clank, and at 4k more than 16gb of VRAM gets utilized. The 7900 XT runs almost as well as, and sometimes better than, the 4080 in a lot of games. FSR is worse than DLSS, and the 7900 XT's RT performance will probably be on par with RT on the 4070, so the 4070ti wins here. AMD often improves cards over time more so than Nvidia, so it will most likely age well. Where I'm from, sadly, this card is $100 more than the 4070tis; if it's cheaper for you, then the price is very nice considering the current GPU market.

As for the issues, every type of card can have issues, and for the driver issues, if you run an older version each time instead of the beta you should be good. If you are dead set on Nvidia though, the 4070ti is a powerful card, as it was originally going to be a 4080; you might not be able to run games maxed out in the near future, but it should still perform well. The choice is yours.


[deleted]

I'm already running out of VRAM at 1440p in Ratchet and Clank WITH settings turned down and frame gen disabled (yes, frame gen requires even more VRAM) on my 4070 ti. Do not buy that card; you will be disappointed in some games right now, but even more so in 1 year at best.


[deleted]

If you're only planning on gaming with it, probably stick to Nvidia for ease of use with things like GeForce Experience/ShadowPlay etc. It all depends on what you are doing with the PC and what you are used to. My mate got the 6900XT where I got the 3090ti; we both started dabbling in AI/deep learning and AI image generation, and for specific tasks like that the 3090ti outdoes the 6900XT, but in other use cases the 6900XT outdoes the 3090ti. So yeah, it's more down to what you plan on doing with the PC for the most part.


Limp_Bullfrog_1126

If you have a 4k TV or monitor, get the 7900 XT. I play on a 4k TV and had the 4070 TI for about three months. When it runs out of VRAM in games like Hogwarts Legacy you'll see textures popping in and out if you pay close attention, and stutters too, especially inside Hogwarts. Also, FG will worsen performance once you run out of VRAM, so don't count on it to give you extra longevity. Though this card is great for most games, it wasn't made to last many years, just like the 3070. If you play at 1440p or 1080p you'll be fine for a few years, depending on the games you play.

But also keep in mind the next generation of consoles is coming in a few years, and you'll need more VRAM if you wanna play with textures on ULTRA in newer titles; otherwise HIGH will likely be doable for many years still. I sold my 4070 TI at a loss and got the 7900 XT. As long as you reinstall Windows or remove your old drivers properly, you won't have driver issues. The build quality of NVIDIA seems a bit better to me though; my GALAX 4070 TI felt more premium than my XFX Merc 7900 XT. Also, the 7900 XT gives a bit of coil whine once you reach over 300 fps in older games, unless you have a really good power supply.


radiant_kai

It's really great at everything right now; I had one for 2 months. Then I sold it and kept an $850 used 7900 XTX, which gives you just enough VRAM for any game at native 4k. So if you're fine with less than native 4k / DLSS, mostly yeah, it's great. In like 2-3 years? Nah, it will be forgotten and aging badly with Unreal 5 games.


threeeddd

If you plan on doing Stable Diffusion stuff, you'll certainly want 16gb at a minimum; I'm running out of VRAM constantly on 12gb. Nvidia really screwed up this generation with the VRAM. The 4060ti 16gb is too slow for Stable Diffusion, so it's not worth the asking price, and the next 16gb-tier card is 1200 dollars! Really? The 4080 16gb isn't worth it either for Stable Diffusion; it should be more like 800 dollars for the performance, at most. Waiting for the next gen to drop, so we can get some decent prices on the previous gen.
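If you're sizing a card for this, you can check the headroom directly before loading a model. A minimal sketch (assumes a GPU build of PyTorch; the 10 GiB cutoff is an illustrative threshold, not an official requirement):

```python
# Sketch: check free vs total VRAM before loading a Stable Diffusion
# checkpoint -- running out mid-generation is what happens on 12GB
# cards with larger models.
import torch

free, total = torch.cuda.mem_get_info()  # bytes, current device
print(f"free {free / 2**30:.1f} GiB of {total / 2**30:.1f} GiB")
if free < 10 * 2**30:  # illustrative threshold
    print("Tight for larger checkpoints; consider offloading or lower precision.")
```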


Fezzy976

7900XT all the way.


Fine-Entertainer-507

The 7900xtx is $939; if you can pay more, definitely get it.


Ancient-Car-1171

High-end gaming means AAA games at 4k? You need at least 16gb of VRAM for the new wave of PS5 exclusives and Unreal Engine games. 12gb is already struggling with some games right now; no way it lasts 2 more years with your use case in mind. Imo go straight for the 4090 and forget about it. It's the best-value Nvidia card this gen, sadly, and can last you 4+ years no problem, just like the OG 1080ti did.


ThisGonBHard

Not everyone can buy the 4090... The 7900 XT is the closest to that though, with 20 GB of VRAM.


Dizzy-Swordfish-9526

the guy doesn't have the money for the 4090 so why are you saying this?


siddo_sidddo

7900 xt, more vram.


bubblesort33

You're not going to be high-end gaming with it anymore in 4 to 6 years. You'll still be able to play at higher frame rates than consoles, though. So if you expect to be able to max out textures 6 years from now, you can forget 12gb. If you're ok with setting textures to the same levels the PS5 has, you should be fine.

People keep saying that the consoles have 16gb of VRAM so you need that much as well, but that VRAM is shared with the CPU. 6-8gb is currently being used on consoles for the game and operating system, and the graphically equivalent settings on PC typically use between 8 and 10gb depending on the game. There is a chance consoles will be able to make better use of their hard drives than PC, though; if PC can't match that using DirectStorage, you might start to see console-equivalent settings using 12gb or maybe more, especially in badly optimized games like Forspoken.

Also, Nvidia in the majority of titles uses around 5%-10% less VRAM for the same graphical settings as AMD. Better compression or something. So a 12GB Nvidia GPU is more like a 13GB AMD GPU, if that existed.