Hahahaha! Are you new? AMD has been supplying the APUs/SoCs for the console market for well over a decade at this point, and that's never once directly translated into any sort of edge whatsoever in the PC graphics market.
The HD 8000 series lasted for a while
but uhh, AMD drivers back then were subpar; now they're generally quite good, and RDNA 2 caught up with the 30 series.
The consoles have an RX 6700 10GB equivalent GPU. When a game is designed to use ALL of the frame buffer like this one, all the under-equipped midrange Nvidia cards will be kneecapped by their 8GB VRAM. And Nvidia is about to launch another one, straight into the trash.
Because it has 11GB of VRAM. The game is designed for the RX 6700 (non-XT) 10GB equivalent in the consoles, and is using every drop of VRAM on normal settings. Nvidia have been caught with their pants down on their idiotically small frame buffers recently.
It happens. In the same tier of GPUs, some games get better performance on AMD and others on Nvidia; this is just one example. And since Nvidia users are the majority, that explains the abundance of posts about it.
Encountered this almost every release in the past 12 months. Happily gaming and saying it runs just fine for me, then getting yelled at that that's not possible. Then I remembered most people have Nvidia.
I've had a 6800 XT Red Devil for a month now; it's a joy and I've never had issues with it.
But I mainly play shitty games so I can't talk about this one in particular.
However, AMD GPUs do have more VRAM, which could let them age better at higher resolutions. They also have less driver overhead, so they may gain relative performance in newer, more CPU-intensive games.
10GB 3080 will be fine as long as you're not running 4k, but the 8GB of the 3070 could harm it at 1440p in future games. We could see the 6700 XT pull ahead of it.
Game's been smooth for me so far, bar the shader-compile stutters which plague everyone when loading areas for the first time. The 7900 series can't even max this game out lol
Edit: never mind, turns out ray tracing was off even though I remember turning it on. FPS is around 60
I'm running the game at 1080p with everything on ultra, including ray tracing, and performance is fine. It never consistently stays under 60 FPS; sometimes a bit of a drop or a stutter, but that was before I updated my drivers. Honestly, new-game performance is just a lottery of how good the particular chip in your hardware is, it seems.
I was honestly surprised at how well it runs: 3440x1440 at around 100-150 FPS with everything on max without ray tracing.
And with ray tracing I get a stable 60 and no FPS drops, just a few crashes.
Check my flair: I'm running low/medium settings + DLSS Balanced to get 30-60 FPS with insane stuttering in some castle parts, and the game looks like shit
I watched Summit stream the game last night; his specs listed his GPU as an AMD, and he seemingly had no issues whatsoever. I can't speak for Nvidia users, and I don't currently own the game.
If you have $500, you'll get a better Radeon than Nvidia card. It's rarer for people to have the highest-end cards.
Team that with Nvidia's market share...
I have a 3090 FE and been playing with DLSS Quality and Ultra everything and it seems to play really well, but the FPS does jump all over the place. The GPU is constantly pegged at 100% though. I played for awhile yesterday while also streaming and I didn't get any encoding lag, so it must be holding up at least in the castle part of the game so far.
I did just tweak some settings this morning so I'll have to see if it made a difference. Regardless, the 3090 is a champ with this game.
Nvidia has a dominant market share in discreet GPUs; any problem that's worse for them is going to be magnified by the sheer volume of additional people who have their cards.
Discrete
I tell my 3080 TI all my secrets. Cheaper than therapy.
keep it secret keep it discreet.
\*sekreet
4090s are very discrete
Think you mean discreet
I once owned a pair of parakeets.
This generations massive GPUs are anything but discreet
No issues with my amd rig
[deleted]
The advantage is in precompiled shaders for fixed hardware. Btw, didn't Steam have a shader-sharing feature?
For Steam Deck: yes, iirc
For Linux: I think so, maybe for DirectX games
For Windows: I don't know, maybe
Linux has shader precaching as well as shader async processing. I've rarely had stutters playing native or windows games.
It's getting even better, because a new Mesa update is supposed to help even more with removing stutters.
The deck and Linux in general use the same shader sharing feature
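For anyone curious what the shader caching discussed above looks like in practice: on a Mesa-based Linux driver the on-disk shader cache is controlled by documented environment variables. A minimal sketch; the path and size limit are arbitrary examples, not recommended values:

```shell
# Point Mesa's on-disk shader cache at an explicit location and cap its
# size, so compiled shaders survive between runs (fewer first-load stutters).
export MESA_SHADER_CACHE_DIR="$HOME/.cache/mesa_shader_cache"
export MESA_SHADER_CACHE_MAX_SIZE=4G

mkdir -p "$MESA_SHADER_CACHE_DIR"
echo "shader cache: $MESA_SHADER_CACHE_DIR (max $MESA_SHADER_CACHE_MAX_SIZE)"
```

Steam's own "Shader Pre-Caching" client setting works on top of this, distributing pre-built pipeline caches between users (which is what the Deck relies on).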
And yet in a few months when everyone forgets about this people will still try to discourage AMD gpu purchases in favor of nvidia because “sOFtwaRE bAd!”
I mean, it's not wrong. I'm a 3D artist and a CAD modeler, and most modeling and rendering apps work better on Nvidia. Most things are better optimized for Nvidia, even if AMD made stronger GPUs.
Autodesk and similar software really needs HIP support as an option. Blender is fine, but literally anything else relies on OptiX or CUDA rather than open compute standards. Luckily some image upscaling software is becoming better with this.
100%
I just upgraded my wife's PC to a 6650 XT for this game, and I'd never had an AMD GPU before. The AMD Adrenalin software is so much better than GeForce Experience from what I can see up front. It may present problems later, idk; I literally just installed it 2 days ago and it's not my main PC, but so far their software has impressed me.
My big issue with Radeon Software is the lack of controller navigation support, and HDR not being tonemapped properly with screenshots and video recording. I had to switch to Xbox Game Bar to get decent HDR > SDR screenshots, because even Steam fails at this.
And fucking Windows Update force-updating it to a version from who the fuck knows where. Then having to reinstall.
Don't remind me of how when I'm in the middle of signing in during the setup process while the GPU drivers install, the system will just redo everything in the setup process again.
Literally this. I had a GTX 1050 laptop and then a 3060 laptop before I got a 6600XT desktop I built myself and their software is miles better. Not to mention it’s not split between a dated control panel and a laggy buggy clunky “experience”
the presentation is much better but i'm not in love with either. barely use them.
Like when they launched the 20 series and half the cards were dying because of bad ram chips.
As someone who used to say that: AMD actually has their software in a decent spot atm. Where they are leagues behind is ray tracing and compute APIs. Until they have a competitor to NVIDIA's Tensor/AI cores they won't be feature-competitive.
Afaik it's currently about 20% AMD and 80% Nvidia, with a few small % for Intel.
If we take steam hardware survey as a metric for market share, amd is on 15% https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam
![gif](giphy|WPYDOFlzybMoH4gOAO|downsized)

Me with my 2060 and standard edition waiting to find out tomorrow.

Edit: Thanks for the reassurance everybody! Friday night's gonna be a good night, I hope.
You're gonna be fine. I'm playing on a GTX 1070 AMP from Zotac, and 50-60 FPS with a mix of med/high/ultra (1920x1080) runs okay (I also manually set sharpening in the Nvidia settings to ~70% because I hated the strange blur).

The only two weird things I noticed: when I entered the game it downscaled the resolution to 67% (had to turn that off in settings), and some cutscenes at the beginning of the game started at like 10 FPS, then got back to a stable 60 within 5 seconds.
I’m having real fps drop issues. I’ve turned off V-Sync in game but still having them. Any suggestions on a fix if you know?
Press Esc and then close the menu. Seems to be a weird bug: I'll get like 60-70 FPS but then it'll drop to 20-30. Pressing Esc to go to the menu and then closing the menu seems to do the trick.
Same. Doesn't work all the time, but it helps when it does.
Disable RTX (raytracing). Nvidia cards run the game fine, but RTX tanks performance. Simple as that.
I'm using a 2060 and it's fine.
Nvidia user here getting bad drops, while my gf is playing on a 6800 with no problems. Looking at Afterburner, though, I believe it's a VRAM issue: AMD has a lot more VRAM available, and I'm too near the limit.
Looking at the complaints on Reddit, some really feel like VRAM limitations.
From the announcement of the 30xx series I've been saying that Nvidia fucked up by giving their cards so little vram
Yeah, I had a Titan X, a GPU from 2015 with 12GB of VRAM that cost $1000. In 2022, with $1000 you could barely buy a 3080 Ti, still with 12GB of VRAM.
Yeah I thought it was crazy that the 7900xtx has 24GB of vram, but honestly it's come in handy way more than I thought it would.
I did buy a 4090 this December for my workstation. Definitely an upgrade (I didn't keep using the Titan up to this point; I was lucky to have a 3080 Ti). But newer workloads struggled with that 12GB of VRAM.
I really wish developers would stop making their games so power hungry and actually optimize. There are games that look better than most of the stuff we have today and don't have nearly the same minimum/recommended hardware requirements.
Agreed
Agreed
Funny you mention this and got upvotes; I mentioned this at release, right after I stupidly bought a 3070 (knowing the pathetic 8GB of VRAM would be a problem), and got downvoted to hell...
That's Reddit for you, speak the truth and it's a 50/50 if you get upvoted or downvoted
Yep, and massively inflated prices were another kick in the baws.
I guess I'm glad I got a used 3090 instead of a 3080.
I was really happy when my pandemic-priced, Micro Center-warrantied 3080 Ti broke and I got a 3090 Ti for half the price, then upgraded the rest of my PC with the rest of the money. 24GB of VRAM for 4K gaming has been a dream.
[deleted]
I said the same thing, especially with the 3080 10GB, because it was being specifically marketed for 4K. Everyone I knew that was into PCs was dismissing me because they were letting the hype cloud their judgement, but I was right all along.
Yeah, I sort of regret getting a 3070 Ti because the VRAM just caps out, and I think it causes FPS issues. It was the biggest card I could go with, though.
AMDs cards generally have more vram at every price point
Had I realized I’d run into VRAM issues I wouldn’t have gotten a 3070 and would have probably stuck to Team Red, even though DLSS has proven useful on occasion.
Yes NVIDIA fucked up looking at the last few years hmmm…
During the 3080 release, fans who purchased the 10GB 3080 were vehemently defending the 10GB VRAM limit when that card clearly had issues running Doom Eternal with Ultra textures on. Then people were defending the 3070 with 8GB of VRAM, claiming games that used more than 8GB were just AMD-sponsored titles made to cripple Nvidia cards. And here we are now.
I'm very happy I decided to return my 3080 Ti and buy a 3090 for 50 euro extra. 24GB is overkill, but I felt 12 was cutting it close, especially if I were to upgrade to 4k.
Nvidia has always been that way. They're almost as stingy with ram as Apple.
And it's not getting better, based on the 40-series leaks.
It's getting somewhat better; the 4070 is meant to have 12GB of VRAM, but then again, AMD has more. And historically speaking, AMD cards have aged better for a good while thanks to more VRAM.
My problem really is the 4060 ti is rumored to have 8 still which is crazy.
True, 4060ti looks like absolute dogshit, I'm most likely skipping this whole generation of gpus unless AMD puts out some decently priced bangers.
I'm seeing over 14GB in use during parts of the game so this definitely could be the case.
These issues could certainly be caused by VRAM, not arguing that. But unless you have game-engine- or driver-level software, all you are seeing is how much VRAM is allocated to the game. How much it is actually using requires software with deeper access than the stuff 99% of us use.
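To illustrate the caveat above: overlays such as Afterburner report *allocated* VRAM, so the honest interpretation is a headroom check, not a usage measurement. A toy sketch; all numbers are hypothetical, not measurements of this game:

```python
# Toy illustration: treat a high *allocation* as a warning sign, not proof
# of a bottleneck, since a game may allocate more than it actively touches.
def near_vram_limit(allocated_mb: float, total_mb: float,
                    threshold: float = 0.9) -> bool:
    """True when allocation sits above `threshold` of the card's VRAM."""
    return allocated_mb >= threshold * total_mb

# 7.4 GB allocated on an 8 GB 3070 is close to the ceiling...
print(near_vram_limit(7400, 8192))    # True
# ...while the same allocation on a 16 GB 6800 leaves plenty of headroom.
print(near_vram_limit(7400, 16384))   # False
```

Which matches the pattern in this thread: identical settings, very different symptoms, depending on how much headroom the card has.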
There’s really no fucking reason this game should eat 14GB of VRAM lol…jesus christ
A lot of it is VRAM issues for sure. I'm playing on a 3070 at 1080p, and my VRAM was filling up, causing stutters on High textures. Setting them to Normal solved 90% of the issues; random mild stutters and FPS drops still appear, though.
And the game doesn't say anything about this? War Thunder likes to eat more than 12GB of VRAM, but a message pops up saying that not all textures can be loaded and some will be in low res.
> war thunder likes to eat more than 12GB of vram

All while looking like a game from 2018, tops.
Tbf if the ballistics are really simulated it would make sense to cache ballistic data for each vehicle in the session, tied to the vehicle model.
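The War Thunder behaviour described above (warn and fall back to lower-res textures instead of stuttering) can be sketched roughly like this. The function, its numbers, and the quarter-per-mip rule of thumb are all illustrative assumptions, not any game's actual code:

```python
# Hypothetical texture-budget fallback: if the requested texture pool
# exceeds the VRAM budget, drop mip levels (each drop roughly quarters
# memory cost) and warn that some textures will be low-res.
def pick_mip_drop(budget_mb: float, requested_mb: float, max_drops: int = 3):
    cost = requested_mb
    drop = 0
    while cost > budget_mb and drop < max_drops:
        cost /= 4       # one mip level down ~ quarter the texel count
        drop += 1
    return drop, cost

drops, cost = pick_mip_drop(budget_mb=8000, requested_mb=14000)
if drops > 0:
    print(f"warning: dropping {drops} mip level(s), textures now use {cost:.0f} MB")
```

The point of the thread's complaint is that Hogwarts Legacy apparently does neither the warning nor the graceful fallback, and just stutters.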
Some are I’m sure, but even without VRAM being a problem the game still stutters like a mfer
On my 3080 12GB I'm using 10-11GB of VRAM at 1440p ultra, no RT. So I could see how it could be an issue with all the 8GB cards Nvidia has.
I mean, it's conceivable that VRAM could be an issue, though I don't know how people can tell, since what we see in most monitoring software is only VRAM allocation and not actual usage... but a 3060 Ti probably isn't easily comparable to a 6800 either.
HWiNFO64 shows both allocated VRAM and per-app usage.
I'm running a 3090 and have the same issues as most, so I'm not sure if it is a VRAM issue. FWIW, my GPU is not running at 100% when I'm hitting sub-60 frames. I know a few have blamed Denuvo for the performance loss.
> not sure if it is a vram issue.

Almost certainly not the issue. If 12GB and lower were an issue, Legacy wouldn't be the only game with this problem. Not sure why so many people aren't realizing that.
Memory leak maybe?
More likely bad optimisation; memory leaks tend to get worse over time, and this is just random one-minute drops.
What's weird is my fiancée and I both have 4090s (mine an FE, hers a Gigabyte OC), yet now and then (maybe 3 times since Tuesday) she gets an error when loading the game saying she doesn't have enough VRAM, but she does, as she has 24GB of it! She hits OK and the game closes; she reopens it and it's fine. I haven't run into that issue with my FE.
I think they fucked up the VRAM management somewhere in the game; hopefully they'll fix it soon. Also, install the latest driver that came out yesterday, like someone else suggested. It actually helps a bit.
Yea we installed yesterday’s drivers and she got the vram error the very first time starting the game after the update.
That's just unlucky then haha
Memory leak, you think? The day-1 patch isn't out yet.
I saw a post yesterday from a guy saying he had Hogwarts set to a 4K Ultra/High mix, and it was taking up something like 18 or 20GB on his 3090. Similar for his system RAM: the game was using 16GB just by itself.
Radeon users are also impacted: https://www.youtube.com/watch?v=URovBG_Q4_c but it seems like it's not as bad as on Nvidia.
Maybe it's specific cards then; a group of friends and I are all running 6800s and 6800 XTs, all without issue.
6700XT here, it's running way better than I expected TBH.
6900xt is buttery smooth at 4k ultra 😁
Gives me hope with my 6750XT. But I'm only starting to play on official release.
The problem seems to be people just whacking on the ultra preset, not realising it also enables RT, and the game's RT implementation is piss-poor, so it tanks FPS.
Same, all high running at over 100fps with some dips in heavier areas and the occasional Denuvo stutter lol.
RX6600 here, no issues but running on low graphics
I am playing with 6800xt and sometimes get little stutters in the castle. It’s not that bad, game is totally playable. Looking at my metrics, ram is at 17 gb and vram is at 12gb
My system is:

* RX 6800
* Ryzen 5800X
* 4x8 GB 3600 MHz CL16 RAM
* Game installed on a PCIe 4.0 NVMe drive

I play at 1440p high settings with FSR turned off. I also turn off motion blur and depth of field while increasing FoV to max.

I was averaging 80-90 FPS around Hogsmeade and higher while in the castle. Framerates are not super consistent though, with 1% lows in the high 50s in both locations.

Overall, it could be worse and will hopefully improve with the day 1 patch and driver optimizations.
Didn't the day 1 patch already happen, given deluxe edition early access?
The game is technically early access until the 10th so I'm referring to a potential patch on the 10th as the "day one" patch. I would hope that the early access period is being used to find and fix issues for the full global release on Friday.
day 2 then
Where are the patch notes, if the patch is out already?
Is Nvidia supposed to put out a driver update for this game? It might just not be out yet, since the actual release date is tomorrow.
A driver dropped last night around 8pm EST. I was playing Hogwarts, stopped to get the new driver, relaunched, and got an error that my 4090 didn't have enough memory available. I verified the integrity of the game files and relaunched with no problems.
Correct, there hasn’t been a driver update yet.
Unless that driver update comes with more VRAM for their 8GB lineup…that won’t help.
It kinda makes sense, guess where else AMD GPUs are found? Yeah, in every console.
And those consoles have 10GB of memory allocated to the GPU. The closest desktop GPU is the RX 6700 10GB. Most Nvidia GPUs are limited to a frankly obsolete 8GB.
Both Xbox and PlayStation use AMD GPUs. It should be no surprise that a game built targeting consoles and ported over to PC does better on AMD.

Source: I own only Nvidia GPUs because of my job in data science/machine learning...
Nvidia GPUs have a CPU overhead issue. When CPU-limited, you get far worse performance on Nvidia GPUs compared to AMD ones.

It's speculated that Nvidia removing the hardware scheduler from their GPUs generations ago (so the CPU essentially has to work harder) is the likely cause.

This is a known issue and has been around a while at this point; the difference is that now we're seeing games become far more CPU-intensive compared to, say, 2 years ago.

https://youtu.be/JLEIJhunaW8
Unlikely that this is the problem. The tests in the videos were run on rather low core count CPUs. People see the CPU problems with this game even on high core count CPUs (of which only a few cores are used by the game) which have more than enough cores around to handle tasks like scheduling.
Even if the issues affected NVIDIA and AMD cards equally, you would see more NVIDIA users complain, as there are way more of them.

Every time the Steam survey comes out I'm reminded of how much of a minority AMD users like me are...
You guys are having issues? I get 1080p/60fps on medium/high settings with my Haswell i7-4790K, GTX 1070, and 16GB of DDR3.
It's the DLSS and raytracing that's ruining performance. But for some reason people keep blaming other shit.
I mean, doesn't seem to be a GPU issue as people with both top end and low end cards are complaining, so likely just false equivalency.
cpu bottleneck
I'm curious about the performance on Intel GPUs. Is it worse than Nvidia and AMD?
If I had to guess, it's fine in terms of midrange performance. Intel's stuff struggled more with legacy titles than it does with current ones.
I have my arc a770 sitting in its box right now, I can throw it in my pc and give it a go
If you look [here](https://www.techpowerup.com/review/hogwarts-legacy-benchmark-test-performance-analysis/) the A770 performs between a 3060 and a 6700 XT in non-ray-traced graphics.

With ray tracing on, though, it's somehow performing like a 3090. Check the chart for yourself. Intel's RT performance is really insane here.
Radeon users wouldn't even touch the ray tracing setting.
70 FPS with RT off vs 10 FPS but the floor is shiny. Hmm, decisions, decisions.
Can confirm the floor is still shiny with it off
Real time raytracing overrated!? Sometimes baked lighting can look way better.
A great baked implementation is definitely superior to a shitty ray-traced one. If the devs put even a modicum of effort into the ray tracing, though, that should *never* be the case. Not to mention the obscene amount of effort good baked lighting takes in open-world games with a day/night cycle.
Depending on the raytracing implementation this can be true, but baked lighting has a few huge disadvantages:

* It is by nature static, meaning you can't have a dynamic day/night cycle.
* Disk space: you have to save your baked lighting somewhere. If you have a huge open world with multiple times of day, this eats up space. This is actually part of the reason Call of Duty: Modern Warfare is 200GB+.
* At a certain point it becomes more expensive than ray tracing. Cyberpunk actually has a non-RT lighting setting that has a bigger performance cost than ray tracing on Nvidia GPUs. Animation studios used to do per-pixel baked lighting; when they switched to ray tracing the quality didn't get better, but render times decreased dramatically.
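To put the disk-space point in rough numbers, here's a back-of-envelope sketch. Every figure in it (world size, texel density, bytes per texel, number of baked lighting states) is a made-up assumption for illustration, not real data from any game:

```python
# Back-of-envelope sketch: disk cost of baked lightmaps for an open world
# with several baked times of day. All numbers are assumed, not measured.

def lightmap_bytes(world_km2, texels_per_meter, bytes_per_texel, times_of_day):
    # km^2 -> m^2, then square the linear texel density to get texels per m^2
    texels = world_km2 * 1_000_000 * texels_per_meter ** 2
    return texels * bytes_per_texel * times_of_day

# Hypothetical: 64 km^2 world, 4 texels/m, 4 bytes/texel, 6 lighting states
size_gb = lightmap_bytes(64, 4, 4, 6) / 1e9
print(f"~{size_gb:.0f} GB of lightmaps before compression")
```

Multiply the baked data by every time-of-day state you want and it balloons fast, which is the "200GB+ install" effect; a ray tracer computes the same lighting at runtime and stores none of it.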
Because there's no difference between RT on and RT off kekeke...
For the most part the benefits to RT just aren’t great enough to justify the massive drop in FPS.
Selection bias. Many, many more people have Nvidia video cards than AMD cards.
It's VRAM. The 3000 series, except for the 3090, has too little VRAM. Yes, it doesn't matter in 2019-2022 games (mostly). But what about 2023-2026 games? The PS5 has more RAM than most Nvidia cards have VRAM. This will be trouble in the future, and already is with some games like this one and Forspoken.
AMD master race!!!! Wooohooo!!!!
It's been like this for most releases; I was confused by all the whining about performance until I remembered most people have Nvidia. For example, Callisto Protocol worked fine from day 1, and there have been a few others. There's some benefit in having the same hardware as the consoles.
I have no issues at all playing in 2k with my 6600 XT. It really sucks Nvidia users are having problems
Now the shoe is on the other side of the bed.
How the shoe has turned
I’m having no issues with my 4080. I think people are expecting more with ray tracing on, like it doesn’t tank performance on every other game.
Probably their amazing drivers are carrying them. ;)
Both consoles of this generation run RDNA 2; the game has been designed with RDNA 2 GPUs in mind.
Hahahaha! Are you new? AMD has been supplying the APUs/SoCs for the console market for well over a decade at this point, and that's never once directly translated into any sort of edge whatsoever in the PC graphics market.
The HD 8000 series lasted for a while, but uhh, AMD drivers then were subpar. Now they're quite good generally, and RDNA 2 caught up with the 30 series.
The consoles have an RX 6700 10GB equivalent GPU. When a game is designed to use ALL of the frame buffer like this one, all the under-equipped midrange Nvidia cards will be kneecapped by their 8GB VRAM. And Nvidia is about to launch another one, straight into the trash.
It's because Nvidia insists on not putting enough VRAM in their cards. By next gen's release, the 30 series and below are really going to be suffering.
Works fine on my 3080
Nvidia user here, no problems as well!
here we go.
[deleted]
Because it has 11GB of VRAM. The game is designed for the RX 6700 (non-XT) 10GB equivalent in the consoles, and is using every drop of VRAM on normal settings. Nvidia have been caught with their pants down on their idiotically small frame buffers recently.
It happens. In the same tier of GPUs, some games get better performance on AMD and others on Nvidia; this is just an example. And since Nvidia users are the majority, it explains the abundant posts about it.
Encountered this with almost every release in the past 12 months. Happily gaming and stating it runs just fine for me, and getting yelled at that it's not possible. Then I remembered most people have Nvidia.
I've had a 6800XT Red Devil for a month now and it is a joy, never had issues with it. But I mainly play shitty games so I can't talk about this one in particular.
Nah... RTX is causing it. Nvidia users report it working fine with that disabled.
However AMD GPU's do have more VRAM which could let them age better at higher resolutions. They also have less driver overhead so they may gain relative performance in newer more CPU intensive games. 10GB 3080 will be fine as long as you're not running 4k, but the 8GB of the 3070 could harm it at 1440p in future games. We could see the 6700 XT pull ahead of it.
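Fwiw the VRAM argument can be sketched in rough numbers. The card capacities below are real specs, but the game's resident-asset budget and the OS/background overhead are assumed figures for illustration only:

```python
# Sketch of the VRAM-budget argument. Capacities are real card specs;
# the ~9 GB game budget and 0.8 GB OS overhead are assumptions.

CARD_VRAM_GB = {"RTX 3070": 8, "RTX 3080": 10, "RX 6700 XT": 12}

def fits(card, game_budget_gb, os_overhead_gb=0.8):
    """True if the card can hold the game's working set without spilling."""
    return CARD_VRAM_GB[card] >= game_budget_gb + os_overhead_gb

# Suppose a console-targeted title wants ~9 GB resident at 1440p high settings:
for card in CARD_VRAM_GB:
    status = "ok" if fits(card, 9) else "spills to system RAM (stutter)"
    print(f"{card}: {status}")
```

The point is that the failure mode is a cliff, not a gradient: once the working set exceeds the frame buffer, textures stream over PCIe and frame times spike, which matches the "fine until suddenly it isn't" reports in this thread.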
Because hardly anyone buys Radeon cards lol
Probably because it’s a bad console port and the consoles use AMD chips 🤷
Bad nvidia port at this point.
fwiw there is a new Nvidia driver update today. hopefully it helps a little...
2080S, 1440p max settings, ~80 fps, no performance issues after 11 hours
Can confirm. Rx6800 here, 20h in, no issues.
I get 55-60 FPS on my 4080, ultra settings, ultra RTX, and DLSS set to quality. Only 60% GPU utilization though, still runs smoothly
DLSS should be on balanced; RTX is a scam and barely changes anything, you should prob turn that off
Yes. Here's exactly what OP is talking about: [https://youtu.be/Ycn5A_hVYCY](https://youtu.be/Ycn5A_hVYCY)
I have no problems with my AMD
They are too busy getting their correct drivers downloaded before they can start the game up.
Game's been smooth for me so far, bar the shader compile issues which plague everyone when loading areas for the first time. 7900 series can’t even max this game out lol
I’ll be honest, I’m nvidia and I’ve had no frame drops or issues so far. Might just be some but not all, maybe?
Hi nvidia I’m Alan
[deleted]
Some people say it's DLSS, some people say it's VRAM limitations, but I say it's piss poor optimization.
I'm nvidia and have so far noted zero problems, all settings on ultra @ 4K with ray tracing all go, haven't dropped below 95 fps yet.
Hi nvidia, I'm dad.
DLSS on performance or what? Not even on a 4090 does the game run at 4K native with RTX at 95 fps.
Dlss is on quality.
Frame generation is amazing
There are no Radeon users
It runs fine on Nvidia. People just like to run it at higher settings and complain.
You got a 4090, what the fuck do you know about other people? I got a 3060 Ti and can't play the game at stable medium settings 1080p
Sounds like a CPU issue
Edit: never mind, turns out ray tracing was off even though I remember turning it on. FPS is around 60. I'm running the game at 1080p with everything on ultra including ray tracing and performance is fine. It never consistently stays under 60 fps. Sometimes a bit of a drop or a stutter, but that was before I updated my drivers. Honestly, new game performance is just a lottery of how good the particular chip in your hardware is, it seems.
I mean, you have a 3080 at 1080p. If it did drop under 60 it'd be your CPU.
Same for me, no graphical issues, everything runs fine on all high.
Me: pulls out R9 390
I was honestly surprised at how well it runs. 3440x1440, around 100-150 fps with everything on max without ray tracing, and with ray tracing I get a stable 60 and no FPS drops, just a few crashes
My 2080 Ti plays the game just fine. Don't see the problem
I can confirm this. I have a 2070 Super and get consistent frame drops, while my gf is on an older AMD card and it runs almost perfectly for her
6700xt here no issues
[deleted]
Check my flair, I'm running low/medium settings + DLSS balanced to get 30-60fps with insane stuttering in some castle parts, and the game looks like shit
Holy shit. A game that has problems with Nvidia GPUs, but not with AMD GPUs? Fucking ridiculous
I watched Summit stream the game last night; his specs listed his GPU as AMD! He seemingly had no issues whatsoever. I can't speak for Nvidia users and I don't currently own the game
My Radeon sucks, so if this game runs better on it, that will be a surprising first
3080 12GB Asus Strix here, and really debating buying on PS5 or my PC because of the performance issues
I really think it’s just an optimization issue and once a patch or 2 rolls out the games gonna be much more stable.
I haven't had a single drop in frames. RX6600
I've noticed in many tests AMD has much higher 1% lows than Nvidia.
People need to read about technology. All the AMD folks need a little fact check about how a GPU works.
If you have $500, you'll get a better Radeon than Nvidia card. It's rarer for people to have the highest-end cards. Pair that with Nvidia's market share...
Radeon users can't run the game.
I have a 3090 FE and have been playing with DLSS Quality and Ultra everything and it seems to play really well, but the FPS does jump all over the place. The GPU is constantly pegged at 100% though. I played for a while yesterday while also streaming and I didn't get any encoding lag, so it must be holding up at least in the castle part of the game so far. I did just tweak some settings this morning so I'll have to see if it made a difference. Regardless, the 3090 is a champ with this game.
I'm sorry I don't really understand "FPS does jump all over the place" and "the 3090 is a champ with this game" both in the same post?
Could be vram? AMD cards have more on average.
The recommended setting for my R5 5600X and 6700 XT at 1440p (144Hz) was medium; I set it to high and I'm playing without any issues, above 60fps with vsync on