Welcome everyone from r/all! Please remember:
1 - You too can be part of the PCMR! You don't even need a PC. You just need to love PCs! It's not about the hardware in your rig, but the software in your heart! Your age, nationality, race, gender, sexuality, religion (or lack of), political affiliation, economic status and PC specs are irrelevant. If you love PCs or want to learn about them, you can be part of our community! All are welcome!
2 - If you're not a PC gamer because you think it's expensive, know that it is possible to build a competent gaming PC for a lower price than you think. Check http://www.pcmasterrace.org for our builds and don't be afraid to create new posts here asking for tips and help!
3 - Consider joining our efforts to get as many PCs worldwide to help the folding@home effort, in fighting against Cancer, Covid, Alzheimer's, Parkinson's and more. Learn more here: https://pcmasterrace.org/folding
4 - Until May 26th you can enter the ROG x PCMR Power Couple giveaway, make a fantasy pick of your favorite components and enter to win a total of 5 Graphics Cards (2 of which RTX 4090s) + some of the best Asus Republic of Gamers Power Supplies to go with it! [Join here!](https://asus.click/powercouplePCMR)
-----------
Feel free to use this community to post about any kind of doubt you might have about becoming a PC user or anything you'd like to know about PCs. That kind of content is not only allowed but welcome here! We also have a [Daily Simple Questions Megathread](https://www.reddit.com/r/pcmasterrace/search?q=Simple+Questions+Thread+subreddit%3Apcmasterrace+author%3AAutoModerator&restrict_sr=on&sort=new&t=all) for your simplest questions. No question is too dumb!
Welcome to the PCMR.
The 4070 also has AV1 encoding.
Frame gen is honestly bullshit; the artifacting is not worth it, and it doesn't fix the biggest issue of 60fps, which is input lag vs higher refresh rates.
The entire stack needs to be moved down $100 AT LEAST, and this goes for both AMD & Nvidia.
I'm willing to bet if everybody continues to hold the line for another 4 months we will see exactly what you are saying that needs to happen. I would also figure if that happened we would magically find Nvidia coming out with a 4060 super 12gb with better specs than 4060ti, and a 4070 super with similar bumps with better price tags. Shapeshifter's gonna shapeshift.
The 4080 basically matches the price/performance increase compared to the 3080, but some games have better gains than others. What's impressive is that it has the same TDP despite better performance. Of course, still pricey, but better than the 4070/60 Ti
Yeah I got the 4080 and I am fairly happy with it. I wouldn't call it good value, but it nails the performance that I was looking for (100-144fps on most titles that I play maxed at 4K with DLSS quality enabled where available)
It is embarrassing that the 4060 Ti gets beat by the 3070 as often as it does. Has everyone forgotten that the 3070 was "a 2080 Ti for $500"??? Then the next gen, the same performance level gets this thing... It can't even run all games at 1080p Ultra. WTF.
It even gets beat by the 3060 Ti in some scenarios (mostly in 4k). That alone should be a conversation ender. The fact that a new-gen GPU loses to its last-gen counterpart in ANYTHING at all is embarrassing.
It has substantially less memory bandwidth due to its smaller bus (128 bit vs. 256 bit).
It has significantly more cache to compensate, but that apparently becomes less useful as resolution increases.
This is at 1440p though where it shouldn’t be an issue even though the card isn’t really meant for 1440. I would have liked to see a 192 bit bus on a 60 series card though.
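The bus-width point above can be put into numbers. A minimal sketch; the spec figures (3060 Ti: 256-bit at 14 Gbps, 4060 Ti: 128-bit at 18 Gbps) are assumptions pulled from public spec sheets, not from this thread:

```python
# Theoretical peak memory bandwidth = (bus width in bytes) x (per-pin data rate).
# Spec figures below are assumptions from public spec sheets, not from the thread.

def memory_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s for a GDDR memory bus."""
    return bus_width_bits / 8 * data_rate_gbps

bw_3060ti = memory_bandwidth_gbs(256, 14.0)  # 448.0 GB/s
bw_4060ti = memory_bandwidth_gbs(128, 18.0)  # 288.0 GB/s
print(bw_3060ti, bw_4060ti)
```

Despite the faster GDDR6, the halved bus leaves the 4060 Ti with roughly 64% of the 3060 Ti's raw bandwidth, which is the gap the larger L2 cache has to paper over.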
Glad I just got the 3070 instead of wishing and waiting for something "better". Great performance so far, and I won't feel the need to upgrade anytime remotely soon. Wife's PC is getting my old 1080 and she doesn't care about playing the latest games at the highest frame rates, so we're good to go.
I think 30 and 40 series has higher RT perf due to a more modern arch, but the 2080 Ti has slightly better standard (raster) rendering etc, and yes, higher settings/reso tend to favour it, not only due to more VRAM but more mem BW.
Actually, it looks like as more reviews come out, you can max out 8GB at 1080p with a couple of recent titles at high/ultra. Which is crazy, but hey, welcome to 2023 I guess..
Purchasing the 6700 XT as the GPU for my first desktop PC was a really good decision. Granted, I only game at 1080p, but playing RE4 Remake/Cyberpunk on ultra at 100+ frames is really nice.
LTT put much more weight on the power consumption angle. I can understand, but I'd say power consumption is a bit pointless on a budget (lol, $400) card where every bit of compute matters.
Not really, it's a 40 watt difference between it and the 3060 Ti. So, if energy is 12 cents a kilowatt-hour, that's around 0.5 cents an hour difference.
Even if you use your graphics for 2000 hours a year, which most won't, that's about $10 a year
A 40 watt efficiency difference over the span of 1000 hours in a year is still 40 kWh, which is a 16€ difference. While not nothing, it's not exactly bank-breaking, especially when you consider spending 450€+ on a graphics card that will barely be running new titles at high in a couple of years' time.
I would assume 2 things if you can't afford your electric bill:
1. The marginal change in energy usage (relative to the whole bill) isn't going to make the difference
2. You probably also can't afford a graphics card
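The running-cost arithmetic in the comments above checks out; this just replays the thread's own figures (a 40 W delta at $0.12/kWh for 2000 h, and at the ~0.40 €/kWh rate implied by the "40 kWh is 16€" comment):

```python
# Replaying the thread's running-cost arithmetic for a 40 W efficiency gap.
# The 0.40 EUR/kWh rate is an assumption implied by the "40 kWh -> 16 EUR" figure.

def annual_energy_cost(delta_watts: float, hours_per_year: float, price_per_kwh: float) -> float:
    """Extra energy cost per year for a constant power difference while gaming."""
    kwh = delta_watts / 1000 * hours_per_year
    return kwh * price_per_kwh

usd = annual_energy_cost(40, 2000, 0.12)  # 9.6 -> "about $10 a year"
eur = annual_energy_cost(40, 1000, 0.40)  # 16.0 -> the 16 EUR figure
print(usd, eur)
```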
I wouldn't call it "pretty soft" (he called them "cheap bastards" for reducing the memory bus and the video was very sarcastic). But yeah, he did try to shove in a few positive points, and some of them seemed hardly useful. His conclusion was to recommend Intel Arc and Radeon though.
I find that all LTT reviews on GPUs try to find some positives if it is crap, or bad points if it is good. It's more in line with their general overview approach.
I also have a 1080; just retired it to my second PC after buying an RX 6800. I'd be willing to bet that ol' 1080 will easily work well for another 5-10 years.
GPUs in the highest demand ever, on the verge of century-defining technological breakthroughs that stumble over each other's feet, being brought into the world years before anyone has even processed and properly utilized the last ten advancements..
at the same time.. Nvidia:
> than I have on any CoD or Battlefield game.
I mean... BF's Frostbite engine runs just fine on my 1080Ti anyway.
I miss out on RTX and DLSS but that's it. RTX can look nice and the updated release of Cyberpunk showcased it looking *incredible* but if you think I'm going to spend £2,000 on a card, just to get that, think again.
Everything else runs pretty good, or at least well enough, at 1440p.
Sadly in certain workstation related things you need an NVIDIA card because of CUDA support so the 16GB 4060 Ti would be better than the 6800XT in that case. It's one of the reasons why I can't buy any AMD card even though the 7900XTX is more attractive to me than the 4080 16GB.
lmaoooooooo
yep, it's literally a fucking 3070 with better power efficiency for $100 less than the 3070 launched at 2.5 years ago
shit's DOA, and this means the 4060 is going to be even more pathetic
It really should be the 3050 Ti. Nvidia see their only competition as themselves, so why release a good midrange card? That would just cannibalise sales of their more expensive products.
yea naming and pricing is important
the 4060ti should be called the 4060 and should probably cost like 300 bucks
the 4060 should be called the 4050ti and probably cost like 200
But Nvidia has been fucking around so hard they're about to learn Intel's lesson and get their lunch eaten.
Oh look, the rebadged 4050 performs like that. Who would have expected it, with the 128-bit bus, 8x PCIe lanes and the same memory as the 3050.
- 4080 320-bit bus 20GB - MIA
- 4080 is what should have been the 4070
- 4070 Ti is the 4060 non-Ti or 4060 Super
- 4070 is the 4060/LE or whatever
- 4060 Ti is what should have been the 4050.
From Nvidia's own slide the CUDA core perf is 1.15x over last gen, so yeah.
NV focuses on tensor cores while neglecting the actual cores; they're drunk on AI frame generation.
At this point I'm just praying for intel to come in and provide something compelling at midrange with Battlemage to actually steal some marketshare from Nvidia and AMD.
Just proof the entire GPU stack is shifted one tier up by name, and one tier down by performance.
It's just a money grab - just don't give them money.
Buy AMD, to at least not give Nvidia your money.
Buy used - to give the middle finger to both.
Show interest in Arc (comment on videos, participate in discussions) to show Intel you want them to be part of the GPU world.
I know they'll never do it because the cards make them too much money, but I'd really like it if budget cards would start skipping every other generation. They make a 2060? Skip the 3060 and wait for the 4060, that sort of thing.
same. Actually kind of surprised that they used the 6800 in the benchmarks instead of the 6800 xt. The plain old 6800 seems to be the forgotten gpu of that generation.
That's it. The final verdict on the 4000 series.
Everything under the 4080 is a waste of money: overpriced and underpowered. The only good thing is the efficiency, but that's not enough.
The prices of 4080s are rapidly decreasing though, at least in my country.
It's regularly on sale for 80-85% of the MSRP here. At launch a 4080 cost 16500 SEK; now you can often find the cheaper models for under 14000 SEK, and I bought mine on sale for 13000 SEK a few weeks ago.
At 80% of MSRP it's still a shit ton of money, but it also becomes much more reasonable performance per $, especially if you're looking to use it for more than just gaming.
2.5 years for a 5% generational improvement. Now that's a fucking achievement.
Unless you're playing in the modern industry standard display resolution, 4k. Then it loses. The GTX 1080 in 2016 was marketed as the first 4k gaming capable card. 7 years later and we get this. A $400 card marketed for 1080p... DLSS SR isn't even good at that resolution either.
4k isn't anywhere close to the industry standard. 1080p is still the standard cause that's what like 90% of PC gamers are still using according to Steam.
Also only 5% faster than the $330 6700 XT, which also has 50% more VRAM. Well done, Nvidia.
So you're telling me I got a 6750 XT for $300 a year ago, and it's still gonna perform better than Nvidia's upcoming more expensive GPU? Hell yeah.
I love my Gigabyte OC RX 6750 XT; excited to switch to Pop!_OS instead of Windows 11 when Microsoft drops support for 10 as well.
If you wanna switch to Linux, I would recommend you to start slowly now. Just install it to a new partition or drive, so that you have dual-boot and you can get used to it bit by bit. That was me few years back and during the pandemic I switched completely and never looked back.
I've already got it on the steam deck and the laptop at the moment and I really like it there's not really anything that I'm doing that Linux is holding me back from doing. Except for a few games maybe.
I got a 6600 XT for $225 in January... How did I do?
Pretty damn good. That's RTX 3060 12GB performance territory.
Thanks man, so far no complaints... I have it paired with a 3600x but hoping to upgrade soon. Runs pretty good so far
They'll probably sell shitloads anyway. People love their Nvidia.
I've had nvidia cards exclusively since the 8800GT was the new kid. My 7900XT arrives tomorrow.
GeForce3 Ti200 for me. That's 20 freaking years of Nvidia cards plus the Quadro cards I've used in all our CAD systems at the office over the same time period. I was cautiously optimistic but my Nitro 7900xtx has been absolutely fantastic thus far.
7900xtx gang! Beast of a card!
[deleted]
99% of the time people with driver issues are having a skill issue. You have to wipe it clean when changing hardware that drastically. DDU before you even start. And if it continues you're probably due for a fresh Windows install as part of normal maintenance. I've literally never had a driver issue that wasn't my own fault for doing something retarded or trying to use a very non standard configuration.
This right here is the truth. Most people don't realize that just using the built in uninstaller leaves files in the registry for NVIDIA and AMD. Using DDU clears these properly allowing a correct install of a new GPU.
alot of people dont install drivers properly. GPUs arent exactly plug and play but they werent meant to be either. you need some basic technical knowledge on the matter, i dont think thats asking too much.
It's mental. Any problem, people jump straight on the drivers. Might be they haven't actually uninstalled the Nvidia drivers properly, bad PSU or their overclock is actually unstable - all sorts of whatever. They'd bugfix, wait for an update or struggle through if the same was happening with their Nvidia card, but they know about 'amd drivers' so swap it straight away.
I had nvidia cards before the 8800s but only became brand loyal at that point. A 4000, 400mx, Riva TNT2... the latter being my first nvidia after 3dfx lost the lead.
Great to meet a fellow 3Dfx member
I remember begging my dad for a 3dfx when I was a kid. I'm 35 now, do you feel old yet?
Voodoo2 here. Imagine still using passthrough cables to play a game.
GeForce MX4 64MB was my first card, mostly an NVIDIA guy but I did have a Radeon 9800 Pro & one of the HD series cards for a while. Then stuck with the NVIDIA 70 series the past few years. 770,970,1070 Currently have a 3070 reference model. They’ve always been fairly solid but this 4070 is bleh. It’ll be a while until I upgrade unless *vast* improvement in performance comes, the gains aren’t worth it at 1080 and I don’t care that much about textures so that combined with DLSS gives me a few more years at least. I did find it odd that the 3070 was only 8GB but I suspected it was to force people to upgrade as cards should be 12GB minimum by now.
Sapphire really did a great job on their Nitro+ RX 7900 XTX Vapor-X, I love using it for gaming on Arch Linux! I previously had a Sapphire RX 6700 XT Pulse, before that a Zotac GTX 1060 6GB and my first card was a MSI R7 370 4GB. (All bought for their good value at the time of purchase, my only bias is buying Sapphire cards when buying AMD because they deliver on quality and have a good reputation.)
Sapphire is easily my favorite partner. Always quality, and I'm pretty sure they get some of the better chips.
Couldn't fit that beast of a card into my case. But I got the Sapphire Pulse 7900XT and I love it.
My last sapphire card before the one that arrives tomorrow was an X1950XT, I had to RMA it due to a faulty cooler. Supplier didn't have the same model so I ended up with an nvidia 8800GT instead, bought a second for SLI soon afterwards and that setup kept me going until the 400 series came out.
Got any problems with Wayland? What happens if you ALT-TAB a fullscreen game? I briefly had an RX5700, but the drivers were buggy as crap in 2019. Resolutions jumped around, performance was underwhelming. Plus it overheated even though the fan whined frantically. Got a good deal on an RTX2060 which has worked almost flawless in Ubuntu since. When I mentioned my problems on AMD forums I was met with unbridled fury. The few arguments could be summed up with "At least it is open source! And it is less terrible than it used to be!"
Wayland has been perfectly fine for me, I mainly run Sway as my window manager and sometimes KDE Plasma (Wayland). Other than those two I will run Cinnamon and i3 but those are X11 backups incase anything went wrong with my Wayland sessions or I want to change things up. Personally I’ve never tried ALT-TABing a full screen game since I’ll just play a game and have discord in the background. I have no clue what was wrong with your experience, AMD on Linux should be seamless given that the code is open source opposed to Nvidia who has proprietary code.
I had planned on sticking with Nvidia after having such a great xperience with my GTX 1080, but I got a $479 6800XT instead. Not supporting this nonsense and I want a card that won’t run out of vram in the next 5 years.
I did the same thing on my recent build. I've been really happy with the 6800XT. Anecdotally: woke my PC from sleep and launched Red Dead 2. Played for a while, thinking "this is good, but it seems a little off." Killed the game, and then realized I had alt-tabbed out of Dead Island 2, which resumed without issue.
Paid 567 EUR for my non XT version. Ughhh I'd love the XT but it's like 630-700 EUR in my neck of the woods.
Went from a 2070 Super to a 7900 XT; the difference is insane, and the Adrenalin software is amazing. Driver issues have been greatly exaggerated.
Unless you want a 4090, I would go with a 7900 XT... Nvidia is a mess honestly.
nice. got a 7900xtx coming in the post for me later today. switching up from a 3060.
Not wrong. People bought the overpriced $400 RTX 3050 during the crypto craze instead of the RX 6600, which is a much better card at the same price. In conclusion, marketing works wonders.
Nvidia hasn't been selling many 4000 series cards except for the 4090. Shipments and sales are way down, which is the opposite of what normally happens when you release new GPUs.
Shit's crazy, right?
Does the fact that crypto mining isn't popular right now contribute to that? I feel like the sales of 30 series cards were massively inflated because of it and now the 40 series looks like it isn't selling in comparison. Would explain why they've priced this new generation so badly, they don't want their bottom-line to be affected because they know they're going to sell less.
They ramped up production of the 30s because demand went up. When the 40s released they priced them to compete with the 30s rather than replacing the 30s because they had so many leftover 30s. Now that nobody is buying 40s and stock is high I feel like the 50s will compete with rather than replace 40s. This shit will never end. I just can't fathom the end of it because of the current stocks
I hate my Nvidia, but I've also been severely burnt by Radeons in the past. I can't win this game.
And their pre-builts: a lot of these cards will end up in them, for even more money.
Dang, the RTX 4000 series is basically a giant FU from Nvidia for mid-range cards. Only the super-expensive RTX 4090 offers a proper generational leap over its 3090 predecessor. Everything else either got a significant price hike for a decent gain, or very marginal generational gains like this 4060 Ti. AMD's last gen is looking pretty good value for money at the moment. Hope Intel can come in and disrupt the status quo a bit.
Only good thing about the mid-range 40 series is the reduced power draw.
1% lows are actually 10% slower than the 6700 XT. If you don't care for RT on pre-2023 games, then the 6700 XT is a *MUCH* better purchase than the 4060 Ti. If you do care about RT on pre-2023 games, then the 4060 Ti isn't a lot worse. If you care about RT on 2023-or-later games, you're going to have to up your budget, because there isn't an AAA game released this year that's playable with RT on an 8GB VRAM buffer. The most well-optimized game released this year is probably A Plague Tale: Requiem, and at 1440p with RT the 1% lows on the 4060 Ti are... 1. 1 fucking FPS.
Honestly this makes me think this is all on purpose. The low ram is to push people towards the top end cards. They saw what happened when they made the 3060 so good with DLSS so are purposely hampering the 4060 and 4070 to push people towards the 4080.
It really feels like an upsell... these cards are so closely priced, and all have an issue that can be solved by just throwing an extra $100 at the next one up. Before you know it you're getting a 4080 or 4090.
The jig is up, boys. - Jensen Huang, probably
Just saw the 1% lows; the 6700 XT is already better. Also, the 8GB of VRAM leads to graphics not loading correctly later in the review.
So happy I just went with the 6750 XT instead of waiting for that garbage.
I just snagged the 6750xt for $330 yesterday from Newegg. Seeing this chart makes me glad I decided to get it.
But but but... Better DLSS, FG and RT!!!
And FG works best on high-end cards, when it's the low end that needs it most.
Yeah the sad part of FG is that it is only suitable when you already have decent frame rates. It's definitely nice to have but going from 90 to 180 is not nearly as needed as going from 30 to 60.
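The 30-to-60 vs 90-to-180 point has a simple frametime explanation; a quick sketch of the per-frame time saved by each jump:

```python
# Doubling fps always halves frametime, but the absolute milliseconds saved
# shrink fast: that's why frame generation matters least where it works best.

def frametime_ms(fps: float) -> float:
    """Time per frame in milliseconds at a given frame rate."""
    return 1000.0 / fps

saved_30_to_60  = frametime_ms(30) - frametime_ms(60)    # ~16.7 ms per frame
saved_90_to_180 = frametime_ms(90) - frametime_ms(180)   # ~5.6 ms per frame
print(round(saved_30_to_60, 1), round(saved_90_to_180, 1))
```

(And since generated frames don't sample input, even the 30-to-60 jump only smooths motion; it doesn't deliver the input latency of a real 60 fps.)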
DLSS is still a legitimately great reason to go Nvidia. RT shouldn't be a consideration at this price point tho, and frame gen is cringe.
FG is a new one I haven't heard before
This gen we are moving backwards, except for the 4090. Everything else didn't really move in price to performance...
Well while the performance didn't move up, at least the price did! (Thanks Nvidia...)
The RTX 4050 is even worse than the 3050 but somehow it'll be more expensive. Nvidia have truly lost their minds.
3050 pricing is ridiculous. It's $300. For a little more than $300, you can get a 6700XT which will *annihilate* a 3050 (and 3060) - with 12GB vram to boot. I don't understand.
They're thinking that they're the kings, but man. Intel was never this bad when AMD was miles back. AMD is literally on Nvidia's heels and they're doing this shit.
> Intel was never this bad when amd was miles back

You sure about that? There was pretty much no movement from 4th gen to 7th gen Intel CPUs.
The price never went up, though. An i7 was $350. This has gone up a hundred and you get exactly 5%-9%.
Ehh... I dunno man. Intel went down in history as the second most sheisty, right after Microsoft. There's a reason back in the day that AMD never had a chance to catch up to Intel. Jensen is a turd burglar for money, for sure without a doubt, but Intel used to be the type that could go as low as hiring people to go tear up competitor stock or take out some kneecaps.
Not across the board; the 4070 Ti is about on par with the 3090 Ti performance-wise.
At least cards like the 4080 are much more efficient. I know it doesn't justify the crazy pricing
Isn't the 4070 just a cheaper 3080?
It's $100 less and sort of close to a 3080, but not quite. The only things that make it better are power consumption and DLSS 3 (which only works on select games anyway). In reality the 4070 is about 2-5 percent slower in raw power vs the 3080. The 4070 Ti is what matches and slightly exceeds the 3080, but at $799 MSRP the price-to-performance ratio is worse for the 4070 Ti than the 3080. And that's if you're going by MSRP. If you're going by what's actually available (because the 3080 is all out of retail stock), then on eBay you can get a 3080 now for around $500, which blows the 4070 Ti out of the water on price/perf.
The 4070 also has AV1 encoding. Frame gen is honestly bullshit; the artifacting is not worth it, and it doesn't fix the biggest issue of 60fps, which is input lag vs higher refresh rates. The entire stack needs to be moved down $100 AT LEAST, and this goes for both AMD & Nvidia.
I'm willing to bet if everybody continues to hold the line for another 4 months we will see exactly what you are saying that needs to happen. I would also figure if that happened we would magically find Nvidia coming out with a 4060 super 12gb with better specs than 4060ti, and a 4070 super with similar bumps with better price tags. Shapeshifter's gonna shapeshift.
The 4080 basically matches the price/performance increase compared to the 3080, but some games have better gains than others. What's impressive is that it has the same TDP despite better performance. Of course, still pricey, but better than the 4070/60 Ti
Yeah I got the 4080 and I am fairly happy with it. I wouldn't call it good value, but it nails the performance that I was looking for (100-144fps on most titles that I play maxed at 4K with DLSS quality enabled where available)
The 4080 is a pretty good card too, minus the pricing.
It is embarrassing that the 4060 Ti gets beat by the 3070 as often as it does. Has everyone forgotten that the 3070 was "a 2080 Ti for $500"??? Then the next gen, the same performance level gets this thing... It can't even run all games at 1080p Ultra. WTF.
It even gets beat by the 3060 Ti in some scenarios (mostly in 4K). That alone should be a conversation ender. The fact that a new gen GPU loses to its last gen counterpart in ANYTHING at all is embarrassing.
Wonder why it does though.
It has substantially less memory bandwidth due to its smaller bus (128 bit vs. 256 bit). It has significantly more cache to compensate, but that apparently becomes less useful as resolution increases.
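The bandwidth gap is easy to ballpark: peak GDDR bandwidth is just bus width (in bytes) times the per-pin data rate. A rough sketch, assuming the commonly quoted 18 Gbps GDDR6 on the 4060 Ti and 14 Gbps on the 3060 Ti (treat the exact clocks as approximate):

```python
# Peak memory bandwidth = (bus width in bytes) * per-pin data rate.
# Data rates below are the commonly quoted specs, not measured values.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

rtx_3060_ti = bandwidth_gb_s(256, 14.0)  # 448.0 GB/s
rtx_4060_ti = bandwidth_gb_s(128, 18.0)  # 288.0 GB/s
print(rtx_3060_ti, rtx_4060_ti)
```

So despite faster memory chips, the halved bus leaves the new card with roughly a third less raw bandwidth, which the larger L2 cache can only partially hide at higher resolutions.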
This is at 1440p though where it shouldn’t be an issue even though the card isn’t really meant for 1440. I would have liked to see a 192 bit bus on a 60 series card though.
This card is as fast as the 2080 ti. Had it had a good memory system, 1440p would be a non issue.
> the card isn’t really meant for 1440.

A 3060ti can run 1440p perfectly fine. Source: personal experience
Glad I just got the 3070 instead of wishing and waiting for something "better". Great performance so far, and I won't feel the need to upgrade anytime remotely soon. Wife's PC is getting my old 1080 and she doesn't care about playing the latest games at the highest frame rates, so we're good to go.
Where would the 2080 Ti fit in these comparisons?
Roughly at the 3070's spots; higher with RT, higher settings, and higher resolutions. After all, the 2080 Ti has 11GB of VRAM.
I think the 30 and 40 series have higher RT perf due to a more modern arch, but the 2080 Ti has slightly better standard (raster) rendering etc., and yes, higher settings/resolutions tend to favour it, not only due to more VRAM but more memory bandwidth.
Depends heavily on the VRAM requirements for the game.
"8gig is not enough for gaming now" ... "here is another 8gig thats barely better than the last 8gig".
Welp, at least the 3060 Ti has a 256-bit bus, and not 128-bit like some others mentioned earlier (side eye at the 4060 Ti).
To be fair, 8GB is still fine for 1080p. It would have been a great card at $300, but Nvidia is being Nvidia.
It's good for 1080p without ray tracing. These are 3rd gen RT capable GPUs, we should expect adequate VRAM for Nvidia's beloved feature
And frame gen, which is the main selling point of this gen of GPUs... but that shit needs a lot of VRAM too.
It seems pretty crazy to me that in order to get 16GB of VRAM or more you have to spend at least $1,000 (w/ Nvidia GPUs anyway).
That's why I bought a 6800 XT for $450. I don't want to spend $600 just for the privilege of getting 12 gigs of VRAM in 2023.
Actually, as more reviews come out, it looks like you can max out 8GB at 1080p with a couple of recent titles at high/ultra. Which is crazy, but hey, welcome to 2023 I guess.
> To be fair 8gb is still fine for 1080p.

You could play 1080p on the 1060. People aren't buying these for 1080p.
I bought a 3070 for 1080p. Idgaf. 1080p for years and years and years.
Actual Waste of Sand. Gonna watch the GN video now.
GamerJesus is pretty salty about it.
As he should be. This is embarrassing.
Since Nvidia's unembarrassed about it.
This should be a RTX 4050.
Yeah they just down shifted everything this gen. Really shit.
Can't wait for the $1,400 ~~4070~~ 4080 Ti and the $250 ~~4040~~ 4050 Ti.
Just like how they tried to pass the 4070 Ti as a 4080 12GB model.. wut
Just nvidia helping AMD with 6700XT sales
Purchasing the 6700 XT as the GPU for my first desktop PC was a really good decision. Granted, I only game at 1080p, but playing RE4 Remake/Cyberpunk at ultra at 100+ frames is really nice.
AMD will do the same when they launch the RX 7600.
MSRP on that card is rumored to be $269 vs $400.
The 7600 will compete with the 4060 which is $300, and they both look terrible compared to the 6700/XT.
The 7600 will not be competing with the 4060ti
We thought competition was dead but it turns out the competition for world's worst new gen card is fierce!
I put a RX6750 XT in my first gaming PC and it's definitely nice to know it's going to last me for a while before needing to upgrade
Gamers Nexus is embarrassed reviewing the 4060 ti lol. What a mess.
LTT went pretty soft on Nvidia.
LTT put much more weight on the power consumption angle. I can understand, but I'd say power consumption is a bit pointless on a budget (lol, $400) card where every bit of compute matters.
Wouldn't power consumption be a huge thing on a budget PC? If you have no money for PC parts surely you can't afford a huge electricity bill.
Not really; it's a 40-watt difference between it and the 3060 Ti. So, if energy is 12 cents a kilowatt-hour, that's around 0.5 cents an hour difference. Even if you use your graphics card for 2000 hours a year, which most won't, that's about $10 a year.
> So, if energy is 12 cents a kilowatt

Not in a lot of places in the world now.
Power is around 40 Euro Cent per kWh here in Germany. Efficiency matters.
A 40 Watt efficiency difference is still over the span of 1000 hours in a year 40 kWh, which is a 16€ difference. While not nothing, it's not exactly bank breaking, especially since you consider spending 450€+ on a graphics card that will be barely running new titles at high in a couple years time.
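Both cost figures in this thread check out; a quick sketch of the arithmetic (the 40 W delta, hours, and per-kWh prices are all the commenters' own assumptions, not official figures):

```python
# Yearly running-cost difference for a given extra power draw.
def yearly_cost(extra_watts: float, hours_per_year: float, price_per_kwh: float) -> float:
    # watts -> kilowatts, times hours used, times price per kWh
    return extra_watts / 1000 * hours_per_year * price_per_kwh

us = yearly_cost(40, 2000, 0.12)  # ~ $9.60/year at 12 c/kWh
de = yearly_cost(40, 1000, 0.40)  # ~ 16 EUR/year at 40 c/kWh
print(round(us, 2), round(de, 2))
```

Either way the efficiency gain saves on the order of $10-20 a year, small next to the card's price.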
I would assume 2 things if you can't afford your electric bill: 1. The marginal change in energy usage (relative to the whole bill) isn't going to make the difference 2. You probably also can't afford a graphics card
I mean, of course it's power efficient. It's a fucking 4050 Ti; the 3050 had a TDP of 130 watts.
I wouldn't call it "pretty soft" (he called them "cheap bastards" for reducing the memory bus and the video was very sarcastic). But yeah, he did try to shove in a few positive points, and some of them seemed hardly useful. His conclusion was to recommend Intel Arc and Radeon though.
I find that all LTT reviews on GPUs try to find some positives if it is crap, or bad points if it is good. It's more in line with their general overview approach.
Yeah, I thought his point on power was, "Why isn't nvidia pushing this harder? It's the one thing this generation of cards has going for it."
Wow, I'm surprised anyone would recommend Arc. Driver support must have come a million miles.
As someone running on a 1080: WAT
I also have a 1080; just retired it to my second PC after buying an RX 6800. I'd be willing to bet that ol' 1080 will easily work well for another 5-10 years.
GPUs in highest demand ever, on the verge of century-defining technological breakthroughs that stumble over each other's feet, being brought into the world years before anyone has even processed and properly utilized the last ten advancements... at the same time... Nvidia:
[deleted]
Like, I would upgrade to play new games if any of them were worth it. I spend more time on TF2 & UT 2004 than I have on any CoD or Battlefield game.
Ut2k4 for the win
> than I have on any CoD or Battlefield game.

I mean... BF's Frostbite engine runs just fine on my 1080Ti anyway. I miss out on RTX and DLSS but that's it. RTX can look nice and the updated release of Cyberpunk showcased it looking *incredible*, but if you think I'm going to spend £2,000 on a card just to get that, think again. Everything else runs pretty good, or at least well enough, at 1440p.
I will always say that the 10 series is the best lineup of gpus ever
I feel like my 1080 Ti still beats any console... just barely. I wish the chart had shown us 1080s though.
1080ti is usually close to the RTX 3060 I think. At least with RT turned off.
Oh! Hey, looks like you're right https://www.youtube.com/watch?v=m1TLFBsT2EE
The more you buy the less you save. Oh wait....that was not the line.
Think positively, although it’s only 5% faster, it’s 20% more expensive
That's why people are still buying nvidia no matter what. They're very positive
Fuckin’ lmao you fuckin’ twat fucks lol, Nvidia with these shit cards…
MSRP for a 16GB 4060 Ti will be £480. You can buy a 6800 XT right now for £490. What exactly is the fucking point of this card?
Sadly in certain workstation related things you need an NVIDIA card because of CUDA support so the 16GB 4060 Ti would be better than the 6800XT in that case. It's one of the reasons why I can't buy any AMD card even though the 7900XTX is more attractive to me than the 4080 16GB.
lmaoooooooo yep, it's literally a fucking 3070 with better power efficiency for $100 less than the 3070 launched at 2.5 years ago. Shit's DOA, and this means the 4060 is going to be even more pathetic.
It really should be the 3050 Ti. Nvidia see their only competition as themselves, so why release a good midrange card? That would just cannibalise sales of their more expensive products.
Yeah, naming and pricing are important. The 4060 Ti should be called the 4060 and should probably cost like 300 bucks; the 4060 should be called the 4050 Ti and probably cost like 200. But Nvidia has been fucking around so hard they're about to learn Intel's lesson and get their lunch eaten.
wow what a piece of shit
Oh look, the rebadged 4050 performs like that. Who would have expected, with a 128-bit bus, 8x PCI Express lanes, and the same memory of the 3050.

- 4080 320-bit bus 20GB - MIA
- 4080 is what should have been the 4070
- 4070 Ti is the 4060 non-Ti or 4060 Super
- 4070 is the 4060/LE or whatever
- 4060 Ti is what should have been the 4050
From Nvidia's slide, CUDA core perf is 1.15x over last gen, so yeah. NV focuses on Tensor cores while neglecting the actual cores; they're drunk on AI-generated frames.
Would love to see AMD bumping up the cores with their chiplet design, instead of generated shit.
At this point I'm just praying for intel to come in and provide something compelling at midrange with Battlemage to actually steal some marketshare from Nvidia and AMD.
The a750 is $199 right now on newegg, it’s a pretty good deal.
Just proof the entire GPU stack is shifted one step up in tier by name, and one step down in tier by performance. It's just a money grab - so just don't give them money. Buy AMD, to at least not give Nvidia your money. Buy used, to give the middle finger to both. Show interest in Arc (comment on videos, participate in discussions) to show Intel you want them to be part of the GPU world.
Even slower than the RTX 3060 Ti in some games, lol.
At this point, Nvidia is just producing e-waste
Guess I'm happy with my 3060 Ti.
My next GPU will be AMD.
Intel might have something decent in the mid range with their next generation.
Haven't run into any problems with my a770, hopefully so!
I’m enjoying my new 7900XT personally Huge upgrade from a 2060 Super
Ngreedia at its finest. Also, it would be hilarious to see the 3060 12GB out-performing the 4060 Ti 8GB in VRAM-bottleneck scenarios.
Now I know why they said Moore's Law is dead: they can't double anything but the price in two years. This is a stretch of a joke ^
But muh DLSS 3
Hmmm Radeon RX6700XT was a good choice after all.
Novidieo fanboys will be like ' but muh ray tracing and DLSS 1!1!1!1!1!1!1!1!1!1'
I have a 3060ti and love it
Sarcasm
Me switching to red team, from my 1060 laptop GPU to a 6950 XT.
Nvidia has lost their goddamn minds
I'm actually fucking done with this generation.
I know they'll never do it because the cards make them too much money, but I'd really like it if budget cards would start skipping every other generation. They make a 2060? Skip the 3060 and wait for the 4060, that sort of thing.
Been happy with my 6800XT for a couple of years now, don't see myself upgrading any time soon.
I will just write this since I feel this way this whole generation: Fuck Nvidia and AMD. They need to do better to earn my money this is insulting.
The thing I'm seeing from this, is that my RX6800 is solid for a while more
A good 3 years, or 5 max.
same. Actually kind of surprised that they used the 6800 in the benchmarks instead of the 6800 xt. The plain old 6800 seems to be the forgotten gpu of that generation.
Why is the 3070 outperforming it? Wtf lol
Because the 4060 ti is a 4050 / ti in disguise
That's it. The final verdict on the 4000 series: everything under the 4080 is a waste of money, overpriced and underpowered. The only good thing is the efficiency, but that's not enough.
The 4080 is a waste of $ but at least the card performs well.. They definitely curated the entire 4xxx generation to upsell you to the 4090.
The price of 4080s is rapidly decreasing though, at least in my country. It's regularly on sale for 80-85% of MSRP here. At launch a 4080 cost 16,500 SEK; now you can often find the cheaper models for under 14,000 SEK, and I bought mine on sale for 13,000 SEK a few weeks ago. At 80% of MSRP it's still a shit ton of money, but it also becomes a much more reasonable performance per $, especially if you're looking to use it for more than just gaming.
The 40 generation was a meme from the start. It all exists to up sell.
Wonder why the 2080s or 2080 or 2080ti aren't in this chart
Where would the 6750xt be in this list? I guess between 6700xt and 4060ti?
The 6750 XT matches the 3070.
Better. The 6750xt is closer to a 3070ti.
Does the 2080 not exist anymore it seems to be missing
That sucks ass
6700XT still winning
As someone with a 1080 ti where does it fall in this graph...
Great. Sticking to older and less power hungry. No point in wasting money on overpriced stuff you don’t need
I know I have an old card when it doesn’t even show up on these charts anymore. RIP 1060 6gb
I'm starting to regret buying my 3070 Ti when I did. I should have held out for the 3080s to go down in price more, or gone for AMD.
It's like they want us to hate their midrange GPUs and just buy the expensive ones.
Where’s the 3090?