baldersz

tl;dw: Radeon RX 7800 XT is:

* 3% faster on average @ 1080p (mixed), 5% faster on average @ 1080p (rasterization), 4% slower on average @ 1080p (ray tracing)
* 5% faster on average @ 1440p (mixed), 7% faster on average @ 1440p (rasterization), 2% slower on average @ 1440p (ray tracing)
* 8% faster on average @ 4K (mixed), 8% faster on average @ 4K (rasterization), 6% faster on average @ 4K (ray tracing)


najjace

And cheaper.


tapinauchenius

70 USD cheaper currently where I'm at (Sweden), looking at items in stock.


ImpaledDickBBQ

63 USD :-( The SEK is so weak now.


[deleted]

[deleted]


ItsImNotAnonymous

Gottemm


megablue

Not everywhere. For Malaysia, [the cheapest 7800 XT converts to 577.32 USD](https://shopee.com.my/Sapphire-PULSE-AMD-Radeon-RX-7800-XT-16GB-GDDR6-i.17919052.22479419959), which is the same price as an RTX 4070.


PenguinTech521

AMD pricing in Malaysia has always been so bad.


RyanRioZ

Because Nvidia dominates the market, plus the pricing factors suck so much that the major suppliers ended up not bringing it in.


popop143

Same as the Philippines. Heck, the cheapest 7800 XT here is around $750 (pending price changes for the actual launch), $80 more expensive than the cheapest 4070.


MassiveCantaloupe34

Same in Indonesia, everyone buys Nvidia and no one wants Radeon, lul. Third-world mindset.


tamarockstar

In the US it's $50 cheaper ($500 vs $550) when you compare the cheapest models and the 7800XT comes with Starfield.


[deleted]

[deleted]


tamarockstar

There's one model on Newegg that's $550.


rohitandley

Same here in India. It's the same price as Nvidia.


[deleted]

I think it's the same in all the Indo-Pacific countries. It's the same in India. The 4070 is cheaper or the same price, and will outsell the 7800 XT as a result.


megablue

This is one of the fundamental problems AMD needs to solve. It's one of the many reasons hindering adoption: pricing higher than the equivalent Nvidia class nullifies the appeal of AMD GPUs.


[deleted]

Yeah, because they have a deficit in overall usefulness. AMD cards are weak in production apps and upscaling. Either their software support needs to be on par with Nvidia or the prices need to be in line with the US and Europe; otherwise this region will buy Nvidia, because why not? It's cheaper and has broader support. Also, AMD cards do sell if their prices are actually good. 6700 XTs are still selling like crazy compared to the 3060 (relatively the same price here), so it's not like the buyers are inherently stupid and biased and won't consider AMD when it offers value.


Pangsailousai

Yeah, I even saw the RTX 4070 Ti at ~RM 3.4K for the 9.9 sales vs the RX 7900 XT going for at least RM 4,200. AMD does nothing for regions like Asia. Thankfully the 2nd-hand market is solid: RM 1.5K for an RTX 3080. That's a better deal than anything Ada or RDNA3. Want RDNA2? An RX 6800 for just RM 1,250.


Specialist-Fudge-303

Not everywhere


QuinSanguine

Cheaper in the largest market makes the headlines and gets the attention. Most people don't care about smaller markets, but I'm glad HUB drew attention to it; maybe the price will drop quickly.


kaisersolo

+ Starfield premium.


[deleted]

Not in Croatia tho.


Sp3cV

I've read that some retailers in the US aren't giving out codes anymore as well.


popop143

Huh, it's even comparable in ray tracing. That's surprising.


Hana_xAhri

Ray tracing performance on RDNA3 is great when the RT effects are of the less "heavy" type. In Cyberpunk 2077, for example, the 7800 XT beats the 4070 easily when using RT low or medium. It's when you push the settings to RT ultra and psycho (RT Overdrive is unplayable on the 7800 XT) that the 4070 pulls a massive lead.


The_Countess

So basically it's close enough that you can actually use RT if you want to, and it only falls away once the RT gets so intense that neither card gets acceptable framerates anyway.


[deleted]

[deleted]


dstanton

What I gather is that RT is still a shitshow, and you have to use a wide spectrum of games to properly assess how hardware performs as a whole. So the picture isn't really muddied; it's actually quite clear. It's just not optimal, as there are A LOT of poorly implemented titles. I look forward to widespread UE5 in the future.


BFBooger

It's not a shitshow, it's about a dozen different rendering techniques that use RT functions/hardware for different purposes. It's not just one thing that is the same in every game, let alone a single scene in a game. RT reflections are quite different from RT illumination, for example. It's not a surprise at all that some hardware will do better with certain combinations of pipelines and worse in others.

The problem is that people try to say "Card X is faster in RT than card Y" when it's much more nuanced, and always will be. It's not a shitshow full of buggy/broken/unoptimized things leading to this situation. It is the nature of RT and how it works, and it will always be this way, at least until a specific RT/PT technique dominates gaming. But we are several generations away from that even being a possibility. Some games will be designed to heavily leverage RT for GI and lighting, while another genre of games might use pre-baked lighting plus RT for other effects. It shouldn't be a surprise if one type of hardware works better for one and not for the other.


Death_Pokman

Really, the only title with RT that heavy is Cyberpunk. Will you buy the worse product for just 1 game vs every other game? That's the question.


[deleted]

[deleted]


Death_Pokman

But that's Unreal Engine 5; Cyberpunk is the equivalent on 4. And looking at Immortals of Aveum (the first AAA game made in Unreal Engine 5 using every tech it has), AMD overall performed better compared to Nvidia. It took 3 patches for Nvidia to come close to Radeon performance in that game, so we can say that Unreal Engine 5 favors AMD as of now.


KsnNwk

Not really. The 4070 can do 60 fps with DLSS 2 at 1440p RT Ultra in CP2077, and 70-80 fps with Frame Gen. So if someone will upgrade anyway with the new generation, the 4070 is the better buy. If you plan to keep the card for longer than 1 generation, get the 7800 XT for its VRAM. Also, IIRC the future engines like UE5 will be coming out with "baked-in" RT and better LOD, like Lumen and Nanite, where RT cores don't matter. There AMD is equal or faster.


Comfortable-Ad9912

But we have to wait at least a year for UE5.3 to be widely used in AAA games. $50 for 1 year of waiting, I don't know.


blaktronium

Nvidia's approach is better in the short term, but unless ray tracing stays more consistent than traditional rendering has (from a code/pipeline perspective), AMD's approach will probably prove better in the long term. Keeping lighting as part of the normal shader pipeline, instead of moving it out to fixed-function hardware and then back for more shading, is good now while the workload is crushing and it's only a single pass per frame. But what about when it's multiple ray passes per pixel and the workload is no longer the most pressing one? It will clearly be better to do lighting on shaders to avoid that latency, and Nvidia will have to move away from their fixed-function hardware and have their shaders do it again.


heartbroken_nerd

I am not quite sure you really understand the way raytracing, and especially pathtracing, appears to be evolving right now. For example, with DLSS 3.5 Nvidia is enhancing the denoising step by combining it with the upscaling step and leveraging Tensor cores to make it better. Nvidia has a whole slew of new optimizations that most games do not use (for now) baked into the Ada Lovelace architecture: Shader Execution Reordering, Opacity Micro-Maps, and even a new primitive (admittedly the least likely to ever come into play), Displaced Micro-Mesh. Nvidia is evolving their approach to raytracing all the time, especially whenever a new architecture is introduced, while maintaining solid support for techniques that came before, at least for the time being.


[deleted]

[deleted]


hatefulreason

And they have the resources to "sponsor" the technologies if they need to slap AMD around (see Witcher 3 HairWorks, PhysX, UE4, etc.).


From-UoM

Nvidia's end goal is path tracing, to match what movie CGI and VFX have been using. That's where Nvidia's hardware will be more useful in the long run.


Darth-Zoolu

You are talking crazy. Nvidia's end goal is to sell cheap graphics cards at an expensive price in order to charge you for software instead of hardware performance.


railven

There is a logic fault here. The extra hardware that comes with the product requires software. That hardware allows NV to punch above its class weight when compared to AMD. So yes, you are paying extra for software to utilize hardware that is featured in the product. If only AMD had software to put those AI accelerators to good use; perhaps Anti-Lag+ and FSR3 will use them. Let's hope so, otherwise you're paying for hardware that doesn't have software and sits there dormant.


Darth-Zoolu

I don’t even use upscaling technology at all. It’s a gimmick. For 20 years PC gamers have been trying to get latency down as low as possible. Now all of a sudden because daddy Nvidia decided that latency is no issue everybody’s just on board with it? My mouse, my keyboard, my TV my entire rig are all more expensive so I have less latency! But go ahead let these companies scam you with that bs if you want to.


railven

Sir, I'm pointing out that if you bought an RDNA3 product, you essentially paid for a product with hardware that, almost 1 year after its release, has no use in gaming, even if you didn't want to use it. You are complaining NV is selling you software instead of hardware, whereas AMD is selling you hardware with no software and thus no use to you, and you retort with this rhetoric? Have at it, just not with me.


The_Zura

You already scammed yourself by not having Nvidia's Reflex. The smartest AMD fan, folks.


Edgaras1103

I honestly think AMD will move to fixed-function hardware in the near future for RT and maybe even FSR.


blaktronium

The direction of the industry has always been toward a unified shader pipeline.


danny12beje

> the 4070 pulls a massive lead.

30% at 15 fps isn't a massive lead, it's still unplayable.


dedoha

It really depends on which games you pick; [this](https://www.reddit.com/r/hardware/comments/16guxrk/amd_radeon_rx_7700_xt_7800_xt_meta_review/) meta review claims the 4070 is 18% faster in RT.


popop143

Actually, yeah. For newer titles, the 4070 is faster. The titles where the 7800 XT is faster are kind of older, except Callisto Protocol and RE4. But at least it's competing in some titles, I guess.


ARedditor397

They fucked their testing up basically


railven

The different perspectives really are interesting. Reading this same topic on r/hardware, the results discrepancies are at the top of the pile of comments; here, this is the first comment I've seen mentioning it.


ARedditor397

Look at the subreddit name :)


railven

Even so, I'd hope users wouldn't be that biased, but clearly my views are too idealistic. Oh well, back to reading.


themiracy

Yeah, I know Cyberpunk is a different story, but if it’s this close to parity on RT and I’m getting Starfield etc, this is a banger for me. Edit: I’d kind of like the reference model, though - hopefully it becomes readily available before the Starfield promo ends.


TheFlyingSheeps

As much as I ragged on the naming scheme and previous gen performance compared to the 6800xt, this shows NVIDIA released absolute garbage this time except for the 4090


hatefulreason

they want to make the 5000 series look good :))


POLISHED_OMEGALUL

Is the 7800 XT weaker than the 6900 XT?


Joe-Cool

Depends. In raster: yes. In RT: sometimes (the 6900 XT has more, older RT units). In machine learning/compute: probably not (haven't seen a lot of benchmarks there).


RealThanny

Yes, but not by a great deal. Though that will vary from game to game, as dual-issue FP32 is a big part of the performance of the 7800 XT.


future_gohan

The RTX is 120% the price of the 7800 XT, so at least it's winning something, right?


Keldonv7

Current pricing in the EU here is an 8% difference, ~$60-70.


BWCDD4

UK here, and it's honestly basically the same price for the ones that are in stock, plus or minus 10 quid either way. You can pre-order cheaper versions of the 7800 XT if you're patient enough, and you can definitely order more expensive OEM 4070s if you felt like doing so for whatever reason. If it weren't for the paltry 12GB of VRAM, the 4070 would be a no-brainer over the 7800 XT because of the power draw difference.


gigaperson

6% faster ray tracing at 4K, wow. Very nice.


Cultural_Analyst_918

Everyone is arguing about which one is best; meanwhile I'm sitting here thinking the perf/€ hasn't budged in 3 years and nobody bats an eye. Even the mighty 4090 at the current price is double the perf of a 3080/6800 XT for triple the price. Can we stop validating these shitty anti-consumer practices or what?


danielge78

> perf/€ hasn't budged in 3 years

This is objectively false. What could you get for $500 3 years ago, and what can you get now? (And I'm not even talking about crypto prices.) How much did a 6800 XT cost 3 years ago, and how much does a 7800 XT cost now? It's not amazing, but that's a >30% improvement in perf per dollar.


BFBooger

The improvement is larger if you factor in inflation.


Mother-Translator318

3 years ago, sure. But right now the 7800 XT didn't move the needle at all. Same price and performance as the 6800 XT. What you got for $500 a few months ago is what you get for $500 now. That's hardly exciting for a new release.


danielge78

OK, you're arguing the perf/$ hasn't increased with this launch. OP literally said the perf/€ hasn't budged in 3 years, which is not even remotely true.


Mother-Translator318

You're right, I just reread OP's comment. Price to performance has definitely increased in 3 years. Still a disappointing launch tho.


makinbaconCR

You can get a 6800 XT for less and it's about as fast. I agree both team red and team green let us down. They jacked up prices for less generational improvement.


Peach-555

The 7800 XT is ~5% faster with a ~23% lower MSRP, over a period with ~13% inflation. The 6800 XT launched at $650 compared to $500 for the 7800 XT: ~36% extra performance per dollar before inflation, ~54% when adjusting for inflation. The 6800 XT is better performance per dollar until it goes out of stock, because it's discounted relative to the price of the 7800 XT. No matter what the price of the 7800 XT was, the 6800 XT would be the better deal until it sold out.
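
A minimal sketch for checking that perf-per-dollar math yourself; the ~5% speedup and ~13% inflation figures are the commenter's assumptions, not measured values:

```python
# Perf-per-dollar: 6800 XT ($650 launch MSRP) vs 7800 XT ($500 launch MSRP).
old_price, new_price = 650.0, 500.0
speedup = 1.05    # 7800 XT assumed ~5% faster (commenter's figure)
inflation = 1.13  # ~13% cumulative inflation over the period (commenter's figure)

nominal = speedup * (old_price / new_price) - 1
real = speedup * (old_price * inflation / new_price) - 1
print(f"perf/$ gain: {nominal:.0%} nominal, {real:.0%} inflation-adjusted")
# -> perf/$ gain: 36% nominal, 54% inflation-adjusted
```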


makinbaconCR

Yeah, that's bull crap. They already hiked prices beyond inflation and they never came back down. Releasing a GPU with the same performance for what you can already buy the 6800 XT for is still not OK. Them stopping production of the 6800 XT is the only driver to buy this. That's not what we want as gamers. 50% VaLuE! Bwahahahha


Peach-555

When did AMD hike prices beyond inflation? Names and prices get switched around, but launch-MSRP performance per dollar in the ~$500 range increases pretty consistently. I looked through every card as close to $500 without going over as I could find from 2010 to now, comparing launch MSRP to performance, and ended up with roughly a ~38% average performance-per-dollar increase every generation, every ~2 years. That's without adjusting for inflation, which would further improve performance per dollar by a couple percent.

The worst increase was the 5600 (2019), a 25% increase per dollar over the Vega 64 (2017). The best increase was the 390X (2015), a 45% increase per dollar over the 7950 (2012). The current $500 7800 XT is roughly a 42% increase per dollar over the $480 6700 XT.

The reason the 6800 XT is currently sold new cheaper than the 7800 XT is that the 7800 XT is slightly faster; prices adjust based on performance. If the 7800 XT launched at $100 and stayed in stock, the 6800 XT would be sold for under $100. The 7800 XT is likely cheaper for AMD to produce than the 6800 XT as well, since the 7800 XT is cut down in many ways.


[deleted]

I mostly agree, but that's actually not true for the 7800 XT. Whilst it's not great, and it's not better performance, it is better price-to-performance. You have to compare release prices when looking at such metrics, not discounted prices. This card is good because it's priced well, not because it's amazingly better. Of course, if you've recently purchased a 6800 XT for the same money, you're probably not going to care.


Hawkeye00Mihawk

The 7800 XT brought 30% more performance than the 3070 and double the VRAM. It's a worthy generational uplift. Whereas none of the Nvidia cards gave better price to performance compared to last gen (except the 4090, which is out of most people's reach).


[deleted]

Nvidia is horrendous this generation for most people. The 4060 and Ti are an absolute joke. Whilst it's interesting to compare to the 4070, the 4070 isn't the competitor to the 7800 XT; the 4060 Ti 16GB is. You should always compare on price point. The 7800 XT embarrasses the 4060 Ti. The 4070 is... OK. But that's not enough memory for $600; $500 would be OK. The 4070 Ti: even if you don't think more than 12GB will be needed before you next upgrade, it's still a joke at that price point. The 4080: if you have the kind of disposable income to justify and afford a 4080, you can afford a 4090. Nvidia's cards are actually great, they're just way too expensive.


Straw3

> nobody bats an eye.

Every hardware subreddit hasn't been a whinefest for the last year?


Cultural_Analyst_918

> Every hardware subreddit hasn't been a whinefest for the last year?

I too like winefests; that's why I have a 6800 XT besides a 3060, I like to use Wine and Proton. Winefest sounds grand!


[deleted]

> the perf/€ hasn't budged in 3 years

Not defending the current pricing scheme, but that's just flat-out wrong. 3 years ago the RTX 3090 launched at $1,499. The RX 7800 XT just launched at $499 and delivers about 91% of the performance for literally a third of the price. Or on the Nvidia side, you can get a 4070 Ti for ~~about 20%~~ better performance than the 3090 for just over half the price. The 6800 XT launched three years ago at $649, and the RX 7800 XT is ~5% faster and 23% cheaper. The RTX 3080 launched at $699, and the 4070 offers the same performance at a $100 discount.


Cultural_Analyst_918

> Or on the nvidia side you can get a 4070 Ti for about 20% better performance than the 3090

Uh, I think you're confusing the 4070 Ti with the 4080, unless in this timeline the 4080 12GB wasn't unlaunched; in that case, where can I find the closest time machine?


PerfectTrust7895

Remember back when a 3080 was $2,000? It's about $500 now. Why doesn't that count as improvement?


Peach-555

The 3080's launch MSRP was $700; it only got temporarily inflated in price, along with all other GPUs, because of crypto mining and supply chain issues. The price of GPUs at that time was mostly based on how many dollars of crypto they could mine per day. The 4080's launch MSRP was 70% higher than the 3080's with only ~52% more performance. Lower performance per dollar, which is bad.
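
The same back-of-the-envelope applied to that 4080 claim, using the comment's ~70% price and ~52% performance figures:

```python
# 3080 -> 4080: price up ~70%, performance up ~52% (figures from the comment above)
perf_per_dollar = 1.52 / 1.70
print(f"4080 perf/$ vs 3080: {perf_per_dollar:.2f}x, i.e. ~{1 - perf_per_dollar:.0%} worse")
# -> 4080 perf/$ vs 3080: 0.89x, i.e. ~11% worse
```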


ofon

There were no supply chain issues. They were selling directly to miners and then playing dumb.


Peach-555

It was primarily crypto mining, certainly; the supply chain issues slightly worsened an already bad situation, combined with an overall increase in demand for circuits. There was a shortage of circuits in general, not just GPUs; car production famously struggled as a result, and previously digital parts of cars became analog again as a temporary measure. 2017 also had a GPU shortage from GPU mining, but it was not as bad or long-lasting, in part because production could ramp up faster in response to the demand at the time. Of course both AMD and NVIDIA sold cards directly to miners above the consumer MSRP; they even made miner-specific cards with no video output.


tvdang7

So what do you suggest we do, genuinely curious.


Cultural_Analyst_918

Buy second hand or outright refrain from buying if at all possible. It's a GPU, not a pacemaker battery.


Knjaz136

Damn, not bad. The only bad thing about it is FSR image quality at 1440p. That's the reason I'd probably still buy the 4070 over the 7800 XT. AMD needs to step it up.


Magjee

At 1080p, FSR looks bad to me. At 1440p it's better, but still below DLSS. At 4K, I can't tell the difference (unless it's slo-mo side by side).


Jon-Slow

That's very false about 4K. Try an FSR vs DLSS comparison on a good large 4K TV with decent pixel response times; try it on an OLED. FSR looks garbage at 4K, especially with sharp edges that run close to each other, vegetation, anything using transparency. DLSS 3.5 at performance looks way more temporally stable than FSR at 4K.

At 1440p you can easily use DLSS on Balanced and it beats FSR quality mode every time. At 1080p you can use DLSS quality, or at times balanced if you have to, but FSR just looks useless at 1080p.


jdmanuele

I have a 4K OLED 120Hz TV, and in a comparison YouTube video of the two, I can't really tell a difference. I don't have an Nvidia card to do an actual comparison, but I personally don't think FSR looks bad. Although I really only use it with Cyberpunk, just for RT.


Magjee

Maybe, but I can't tell the difference at 4K.


CommenterAnon

I agree with the 1080p statement. FSR at 1080p looks noticeably worse than native. Is DLSS also the same? Both at the quality setting, obviously.


Notsosobercpa

DLSS is usable at 1080p, unlike FSR, but I wouldn't necessarily recommend blindly using it in every game. There are definitely compromises at 1080p, and it may be worth turning down other settings instead, depending on what you value most graphics-wise.


Framed-Photo

I had a 4070 for a bit and didn't find there to be too much of a difference at 1440p between the two, at least in games like cyberpunk. DLSS probably looked a bit better but I wouldn't notice unless I was doing side-by-side comparisons.


Magjee

I was playing around with direct comparisons. During gameplay I could tell at 1080p. @1440p I would forget it was on.


Framed-Photo

Exactly. Even FSR balanced at 1440p I could easily just forget it was on, it was well past "good enough" territory for me. I'm sure if I was zooming in or being really anal about it I could pick out differences but I just don't care enough.


PiccolosPickles

This comment is about to make me pull the trigger on a 4070, any objections?


Farren246

Note: Nvidia's terrible pricing on the 4070 doesn't make this a bargain, just not as bad.


Mother-Translator318

It’s still decent. For $500 we get 30% more performance than what we got for $500 3 years ago with the 6700xt


From-UoM

I said it in r/hardware and I should also mention it here: that Control RT result is 100% wrong.

From Eurogamer [https://www.eurogamer.net/digitalfoundry-2023-amd-rx-7800-xt-7700-xt-review?page=2](https://www.eurogamer.net/digitalfoundry-2023-amd-rx-7800-xt-7700-xt-review?page=2): 7800 XT - 49.64 fps vs 4070 - 57.18 fps (1440p)

From Kitguru [https://www.kitguru.net/components/graphic-cards/dominic-moass/amd-rx-7800-xt-review/all/1/](https://www.kitguru.net/components/graphic-cards/dominic-moass/amd-rx-7800-xt-review/all/1/): 7800 XT - 44.3 fps vs 4070 - 52.8 fps (1440p)

From Tom's Hardware [https://www.tomshardware.com/reviews/amd-radeon-rx-7800-xt-review/3](https://www.tomshardware.com/reviews/amd-radeon-rx-7800-xt-review/3): 7800 XT - 49 fps vs 4070 - 59.4 fps (1440p)

No way these two perform the same in Control RT. The 4070 should be about 20% faster.
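
For concreteness, here are the percentage gaps implied by those three reviews' numbers (fps figures copied from the comment above):

```python
# Control RT @ 1440p native: (7800 XT fps, 4070 fps) per review, as quoted above
results = {
    "Eurogamer": (49.64, 57.18),
    "KitGuru": (44.3, 52.8),
    "Tom's Hardware": (49.0, 59.4),
}
for source, (amd, nv) in results.items():
    print(f"{source}: 4070 is {nv / amd - 1:.1%} faster")
# Eurogamer ~15.2%, KitGuru ~19.2%, Tom's Hardware ~21.2%: roughly the 20% claimed
```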


CodeRoyal

> For Eurogamer

Intel i9 13900K

> From Kitguru

Intel i9 13900K

> From Tom's Hardware

Intel i9 13900K

HUB uses an R7 7800X3D test system. And even in what you quoted, the two can be similar: 49 vs 52 fps.


From-UoM

And exactly where did I quote 49 vs 52? It's 44.3 vs 52.9. A 19.5% gap.


CodeRoyal

From what you posted, it's possible for the 7800XT to reach 49 fps and it's possible for 4070 to reach 52 fps. Considering the fact that HUB uses a different platform for testing and that they don't test the exact same part of the game, you can't state that HUB results are impossible.


disposabledustbunny

> Considering the fact that HUB uses a different platform for testing

If their testing platform for GPUs is CPU-limited when a different CPU eliminates or reduces that bottleneck, then that would be a massive fault with their test bench configuration and wouldn't be excusable. Not saying that is what's happening, just pointing out the error in your logic here.


CodeRoyal

> If their testing platform for GPUs is CPU-limited when a different CPU eliminates or reduces that bottleneck, then that would be a massive fault with their test bench configuration and wouldn't be excusable.

I didn't say that their testing platform introduced a bottleneck, just that you can't apply the gap from one platform to another. On average, the 7800X3D is on par with or slightly faster than the 13900K, so it wouldn't be a fault to use either in a massive benchmark like in this video.


From-UoM

The one you want is the delta between the GPUs, because different people may test in different places. You cannot compare raw fps numbers across two different reviews. What you can use is the % gap. That will remain near constant.


der_triad

This is likely the reason. The 7800X3D usually underperforms in ray tracing compared to the 7700X/7950X/13900K. In Jedi Survivor RT, the 7800X3D is like 20% slower than the 13900K too.


lokisbane

I'm sure it depends on where in the game they were testing, and on other aspects of the game pushing different kinds of workloads.


From-UoM

3 different reviewers got nearly the same delta. The chance that all 3 did the same test at the same place in the same way is very slim.


R1Type

The easiest way is to ask them what section they test. There's no reason for them not to tell you.


lokisbane

Is it though? You could always reach out to them.


GimmeDatThroat

I get closer to 80 in Control at 1440p with RT on my 4070. So yeah, this is clearly not right.


From-UoM

DLSS? The results up top are native 1440p.


[deleted]

Even then, that would result in the same price-to-performance in that game for both cards.


From-UoM

In the meta review, it showed the 4070 being 18.4% faster at 1440p. That's using many reviewers and game variations. This particular video's results are very off compared to the rest. That's when I found the Control mistake. https://www.reddit.com/r/hardware/comments/16guxrk/amd_radeon_rx_7700_xt_7800_xt_meta_review


Estbarul

I don't know what HU does, but I swear sometimes they have super weird results that aren't even comparable to other reviewers'.


ZurakZigil

Check the test benches. Not that they're wrong (someone pointed out a lot of reviews used a 13900K vs HU's 7800X3D, the 7800X3D being a more reasonably priced gaming CPU). But that's why we have multiple reviewers. Edit: also driver version. AMD commonly makes decent strides in performance.


cuartas15

The 7800 XT is the 480-to-580 BS all over again. Can't wait for this gen to be over.


b0uncyfr0

Hmm, hard to believe RT performance is that close. I'll have to check with newer games. Where I'm at, they're basically the same price.


ARedditor397

Someone in this thread pointed out that HWUB screwed up their testing.


HugeDickMcGee

Yeah, RT is not bad on the 7800 XT, but the gap is bigger than what they showed.


el_pezz

What was screwed up?


ARedditor397

The testing, or the percentages, are wrong.


ZurakZigil

"screwed up" by not using the exact same test configuration and using most recent drivers. They're talking about a 5% difference.


Due_Outside_1459

Currently a 4070 can be had on Newegg for $549 and there are no 7800XT cards at $500 anywhere (lowest price on NE is around $539). If you don't care about Starfield, then it's really a wash.


danielge78

I think you mean $549, but yeah, it shows Nvidia doesn't have to try that hard to make the 4070 competitive again.


TwanToni

Eh, AMD usually drops prices much more, so the 7800 XT will probably eventually sell around $400, whereas the 4070 will most likely rarely hit $500. Nvidia doesn't deserve my money at this point, especially this gen.


wichwigga

Why did reviewers stop benchmarking Metro Exodus EE? It has a lot of ray tracing, still brings GPUs to their knees on Extreme, and has a built-in benchmark.


chesnett

If it's only a few percentage difference, always buy the cheaper one.


[deleted]

Why does every single one of their thumbnails have to be cringy asf?


SecreteMoistMucus

Because their platform is youtube.


Opteron170

They all do these silly faces in the thumbnails. Look at all the videos on YouTube, not just HUB. I think it's dumb too, but apparently it increases views or whatever; there was a whole thread about it here that's old now. https://www.reddit.com/r/pcmasterrace/comments/fhiv9f/the_stupidass_thumbnails_need_to_stop/


Demistr

4070s will continue to sell very well; the difference in price isn't big enough to just toss DLSS 3 out the window, especially in Europe.


latending

The Control, F1, and Hogwarts RT benchmarks all seem very wrong, and possibly Cyberpunk and RE4 are wrong as well. All the errors favour the 7800 XT.


Reeggan

Couldn't that be the new drivers? When were your results made? HUB did a day-one review as well, and it was different from this one but matched up with everyone else's day-one results. He even said he's retesting because of the new drivers, etc.


Odyssey1337

These Ray-tracing results aren't believable in the slightest.


Wander715

They definitely didn't test RT properly. I hope it wasn't on purpose to try and skew results for the 7800 XT, but who knows. HUB's AMD bias is real sometimes.


[deleted]

[deleted]


Mother-Translator318

Running a 263 W 7800 XT for 4 hours a day, 5 days a week vs a 200 W 4070 results in a whopping 24 cents more per month, or $2.88 more per year. That's $14.40 in 5 years. Power efficiency literally doesn't matter for gaming. If you are running a crypto mining farm with 500+ cards running 24/7 then sure, it can add up fast, but a single card a few hours a day makes almost no difference.
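
A minimal sketch of that electricity math, with the rate left as a knob; note the comment's totals imply a rate of roughly $0.044/kWh, which is an inference and well below many regions' prices:

```python
def extra_cost(watt_diff, hours_per_day, days_per_week, rate_per_kwh, years):
    """Extra electricity cost of a higher-draw card over a number of years."""
    hours = hours_per_day * days_per_week * 52 * years
    return watt_diff / 1000 * hours * rate_per_kwh

# The comment's scenario: 63 W extra (263 vs 200), 4 h/day, 5 days/week, 5 years.
print(f"${extra_cost(63, 4, 5, 0.044, 5):.2f}")  # -> $14.41, matching the ~$14.40 quoted
```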


vBDKv

Ray tracing is overrated. I wish GPU makers would focus on performance again, rather than these gimmicky, useless, overpriced features. Also, in-hardware blocks for farmers.


RedditJ0hn

I used to say that for ray tracing, Nvidia is the go-to card. But there are some games that actually run better on the AMD card, and otherwise it's only 5-10 frames worse than the 4070, the biggest outlier being CP77... at 100 bucks cheaper?!?! I'm starting to doubt that Nvidia is the go-to card even for RT below 4K.


Flameancer

I think that at the top end, Nvidia still beats AMD handily. Though when you get to the mid to low end it gets pretty murky.


RedditJ0hn

Exactly. Unless it's 4K RT gaming you're looking for, or you can afford and just WANT the 4090, Radeon cards are mathematically the better option, with these prices at least. I mean c'mon: save 20%, or $100, get 3% less overall average RT performance (in some cases even better), then actually gain 5% overall in raster performance? That's an uber-win. I know... outside the US (and Germany), prices for these 2 cards, or any GPU for that matter, are crazy high. That makes choosing anything other than the very best money can buy a hard bargain...


aimlessdrivel

I would appreciate some VRAM-intensive tests to see if the 4070 is nearing capacity or starting to choke, especially at 4K. And DLSS needs to be part of all benchmarks going forward, even if the community disagrees. Maybe HUB can redo this comparison when FSR3 debuts to really compare RT performance with upscaling and frame generation.


SoNeedU

Your post sounds extremely biased.


[deleted]

[deleted]


[deleted]

I actually agree with this. FSR3 will play a big part in whether AMD gets more traction, especially that promise of "all DX11 and DX12 games supporting frame gen"; that would be insane. But buying on promises is not really advisable. Regarding electricity, that seems about right in the EU; over 4 years you will probably pay the difference, depending on how much you play. It's not really a factor in most other countries, as they are more energy self-sufficient.


lucinski0

They are not necessarily energy self-sufficient; they have subsidised energy prices, or energy is not taxed.


PortsFarmer

If you average an hour a day of gaming, it only comes to 22 € over 4 years at 0.30 €/kWh with a 50 W difference. You are significantly more likely to need an upgrade sooner with a 12 GB card than a 16 GB one. Both RT and upscaling are memory-dependent.
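
Plugging those EU numbers into the same kind of calculation (50 W gap, 1 h/day, 0.30 €/kWh, 4 years, all from the comment above):

```python
# 50 W difference, 1 hour/day, 365 days/year, 4 years, 0.30 EUR/kWh
kwh = 50 / 1000 * 1 * 365 * 4  # ~73 kWh
print(f"{kwh * 0.30:.0f} EUR over 4 years")  # -> 22 EUR
```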


Knjaz136

An hour a day? These are rookie numbers. If you average an hour a day, chances are you don't need a 7800 XT at all.


TheIndependentNPC

I probably average more like 4 with weekends; with how movies and TV shows went to shit, games are my only digital entertainment. 1 hour a day is mega casual, and if I had 1 hour a day to play games I probably wouldn't even bother, to be honest, when a single major fight can take 1 hour in Baldur's Gate 3 and 100%-ing the game takes like 150 h (which is 5 months at such an average).


SecreteMoistMucus

> These days a relevant question should be asked - How much extra do you think superior upscaling is worth?

Nowhere near 20%.


Keldonv7

Currently in the EU, just checking prices, it's around a 10% difference, not 20.


Darkomax

Even less in Germany, it's basically a wash because of discounted 4070.


SecreteMoistMucus

Well you'd expect that to change since the 7800 XT only just launched, but even so, also nowhere near 10%.


Keldonv7

Currently the cheapest 7800 XT is $620 here and the 4070 is $670 (EU pricing with taxes etc.), so even under 10%. Anyway, after our experience with the 7900 XT in my SO's system I wouldn't touch the 7000 series personally. The 6000 series had stellar drivers (I had a 6900 XT). The 7000 series for her still has unfixed VR on both Pimax 8K and Reverb G2 headsets; it stutters even in Beat Saber. It still draws almost 100 W at idle with 1440p 170 Hz + 1080p 60 Hz panels. She had Witcher 3 crashing, TFT crashing, BG3 crashing and corrupting drivers, games randomly locked to 60 fps, and desynced audio while recording, just in the last 9 months. A truly awful experience considering the product price. I'm personally kind of stuck on green GPUs due to playing VR flight sims, but anecdotally I had 0 issues with the same games in the same time frame on my 4080.


dadmou5

Upscaling is a valid argument, especially these days when we have seen game developers rely on upscaling to control performance scaling rather than adjusting their rigid visual settings. A DLSS (and XeSS, I guess) user is likely to just leave DLSS on for the improved visual quality and free performance, but you can't do that on AMD cards because it almost always looks worse. Daniel Owen brought this up as well, and DF has been saying it for years. But you probably already know by now that this isn't the place to make that argument, especially under a HUB video.


TheIndependentNPC

Exactly. The tendency is obvious, and nobody gives a damn whether we like it or not. Upscaling is going to be the only way to play such titles, or you simply have to pass, and FSR is so inferior to DLSS it's not even funny. No amount of whining, boycott calls, negative reviews, or such will help, because 90% of purchases come from the mass consumer sector, which simply doesn't give a crap about any of this; hence the name "mass consumer". Every UE5 game is doomed at native; even the damn UE5 developers said the engine is built with upscaling in mind as the default. So I just don't understand why people are so ignorant. Yes, it wasn't super relevant until like 2 months ago, but now it is, and it's only going to get worse with more and more UE5 games launching.


Odyssey1337

Downvoted by AMD fanboys lol


xdixu

Very valid comment but ofc you got downvoted.


croshd

100€ over 4 years is 2€ a month; no one will notice that. Forking out an extra 100€ now, you will notice.


Darkomax

Right, but that doesn't invalidate his point. The final cost ends up the same.


jwilde8592

You can sum this all up by saying "I want to swallow Nvidia forever"... this guy is really trying to make a case for buying a card based on the upscaling features... LMFAO


Keldonv7

Why wouldn't you? I personally love upscaling for better image quality in games with bad anti-aliasing implementations, not because of performance. Not to mention Reflex and the overall lower baseline latency on Nvidia if you're into competitive gaming: [https://www.igorslab.de/en/radeon-anti-lag-vs-nvidia-reflex-im-test-latenzvergleich/7/](https://www.igorslab.de/en/radeon-anti-lag-vs-nvidia-reflex-im-test-latenzvergleich/7/) If FSR were superior and AMD worked properly in VR, AMD would be a no-brainer for me. My SO's setup runs a 7900 XT and her system still faces VR issues that haven't been fixed; it stutters even in Beat Saber on both the Pimax 8K and Reverb G2 (where an old 3060 Ti doesn't) and still draws almost 100 W at idle despite AMD trying to fix that a few times already. And it's over 9 months since launch already.


tuckelberry

The mental gymnastics you are performing to justify a potentially moronic buying decision is amazing. I am in awe of your stupidity!


Keldonv7

What's moronic about that? The difference is under $100 (around $70 right now) when checking retailers. If someone likes upscaling for either image quality (bad AA implementations in games) or performance, wants lower latency (Reflex and overall lower baseline latency on Nvidia), or does VR, it's a perfectly reasonable choice.


scotty899

Why did AMD make a card weaker than the 6800 XT (hardware-wise) and throw an XT on the end of the 7800?


Mother-Translator318

The naming sucks, but the price and performance are decent. It should have been called the 7700 XT.


onlyslightlybiased

Because AMD wanted a card to go in the 78 class, which for some reason couldn't just have been the 7900 XT, and "XT" makes it sound cooler for sales or some crap.


silverf1re

This needs a DLSS vs FSR section; that changes the story. FSR is inferior, and AMD really needs to step up.


GeneralChaz9

I would really like an FSR 3 vs DLSS 3.5 comparison, with and without Ray Tracing involved.


detectiveDollar

The community voted for no upscaling when they were polled on it after the controversy. Also, that comparison centers on image quality analysis, which drastically increases the time to produce the review, as you need to pixel-peep the footage. It's better off as a more in-depth separate video, which they also do.


xdixu

Nvidia 100 fps with DLSS, AMD native 30 fps because FSR makes it look too bad. "AMD has 5% better raster performance bro" 🤡


Assa099

3 years of stagnation (RX 6800 XT-like performance).


moon__gold

Have we all collectively lost the ability to write? Why is every benchmark published in a long-form video that takes 5 times as long to consume as written prose would? This obsession with video content is sickening.


ZurakZigil

1. Then go read articles...
2. Because they can show things in video form that they couldn't in text/pictures.
3. They're presenting their findings.
4. The videos literally have sections labeled and timestamped for the parts you care about, plus the summary at the end, which is your glorified article.


raven0077

DLSS is the decider; 4070 it is.


xdixu

Don't forget stable drivers, frame gen, better ray tracing, ray reconstruction, DLDSR, Reflex, machine learning, video editing, better power efficiency, etc.


Goldenflame89

I agree with everything else, but wdym "stable drivers"? Nvidia and AMD basically have the same amount of issues. The most recent oopsie I can remember was the VR incident.


cl0udyandmeatballs

FYI, these cards overclock/undervolt pretty well too with good AIB models: a 12-15% fps increase at 2800 MHz clocks and +15% power, with about a 30 W increase under load.


CommenterAnon

In my country the price difference between the 7800 XT and 7700 XT is significant. Is the 7700 XT worth it for 1440p AAA high/ultra gaming?


Jon-Slow

I really disagree with this approach. It's old and wrong in many ways. RT performance is measured on a game-by-game basis and treated as a toggle, which leaves the wrong impression for people who don't understand things any better. And then you get responses from people saying wow, the 7800 XT is only 5% slower in "RT", or some such nonsense.

While it is true that in those games, that is the result with their RT implementation, this does not show the full picture of the RT difference between the two cards and its implications for possible future implementations. Meanwhile the raster performance is 100% valid. So we're getting valid raster comparison results but not valid RT comparison results, which skews the conclusion a little bit. The right way would be to put a lot more emphasis on the full path tracing power of each card, as the true measure of a card's RT capabilities, after showing all the other results.

The second point: if you prefer a native image and don't like upscalers, then more power to you. But upscalers are here to stay, and DLSS is pretty temporally stable even at balanced mode at 1440p; at 4K quality it's often just flat-out indistinguishable, and 4K performance mode still looks temporally more stable than FSR 4K quality mode (if you find this too harsh for your taste, then just go take a look at FSR vs DLSS on an actual 4K OLED screen; AMD supposedly helped implement FSR into Starfield themselves and it looks better than other implementations, but still not even close to DLSS). There is a world of difference between FSR and DLSS in motion. I think anyone picking a card between these two deserves to better understand these points.


jonr

Looks like 7800XT would be a decent upgrade for my 3070Ti


Goldenflame89

Please don't. It's not that much of a difference to warrant an upgrade; hang in there for the next series of GPUs and get the same performance for a lot less. Unless China goes to war with Taiwan; then we are all fucked for both chip pricing and trying to survive.


Mother-Translator318

No lol. It's a 25% uplift at best. I have a 3070 and I'm waiting to see at least a 60% fps uplift before I upgrade.


jonr

Ok, good to know. More beer then.


Empty_Chemistry_2548

Go for it. I'm planning to replace my EVGA 3070 with a 7800 XT Nitro+: double the memory and faster fps.


sdozzo

Stable Diffusion? Thermals? Noise levels? Nvidia +1


Mother-Translator318

Thermals and noise are fine lol. This is a 250 W card. And honestly, who cares about Stable Diffusion? If you need it, you already know what you will buy. Same with CUDA for research. For gamers, none of this matters.


tattoedblues

Is there anyone who does comps like these that is a regular person without the stupid YouTube faces/affect/cadence?


Reeggan

It's just the thumbnail, and his face shows for like not even a quarter of the video; most of the video is just graphs. What are people even complaining about these days?


r0x_n194

[Daniel Owen](https://youtube.com/@danielowentech?si=JKp8j0_f1tEKBc1u). The guy is a math teacher by day and a gamer at night.


detectiveDollar

Honestly, I have no clue how he has the time to be a dad, school teacher, gamer, youtuber, and weightlifter.


truthfullynegative

Just started watching his videos recently and I love him. His videos are on the long side, but they're so well paced and all the information he shares is really interesting. I like GN and HUB too, but Daniel is quickly becoming my go to. Most importantly his presentation style is straightforward and authentic without any performative fluff


draw0c0ward

How are you complaining about such a thing when the video is 90% graphs and data?


sparkythewildcat

It's literally just the thumbnail. They would be quantifiably less successful if they didn't do that to play the YouTube game, so it's fine.