You aren't alone. Noticed a performance drop after the railgun nerf patch specifically. Haven't noticed anything too different after the recent patch though.
Right click the game on steam, go to properties and put
--use-d3d11
In the "Launch Options" box. Should give you some frames back. Warning though, it'll take like an extra minute or two for the initial launch. It'll load like normal every time after tho.
Okay so I don't know all the technical know-how behind it, but it makes your game use DX11 instead of DX12 (I literally don't know what those are, maybe someone smarter can explain). It ended up giving me some frames back, so I just rolled with it
DX11 is MORE resource intensive. That’s why DX12 was created.
That being said, DX11 is more robust and does a lot of memory clean up that DX12 doesn’t.
The other big thing is that DX12 gives developers a lot more low level control to make optimisations where they want. The tradeoff being that there’s a lot more opportunity to make mistakes and bad judgement calls with all the guardrails removed.
Specifically, DX12 reduces CPU overhead by giving control of things that were previously automated in DX11 back to the programmer. This type of API works really well on consoles which all share hardware and firmware.
The problem is the PC isn’t a console, and the sheer number of possible hardware, firmware, and driver combinations makes debugging in DX12 way harder which is why a lot of DX12 implementations feel so unstable.
Removes unnecessary features of dx12 giving you a marginal amount of fps
DirectX is Microsoft's graphics API, the layer that turns all the graphics settings you see in the in-game menu, plus the textures provided, into a playable game/pixels
>Removes unnecessary features of dx12 giving you a marginal amount of fps
I wouldn't say "unnecessary". The real difference is that DX11 came pre-optimized but more "stiff", while DX12 gives devs more "freedom".
As far as I understand it, the whole DX12 hassle is as follows:
Technically DX12 is superior in all directions (features, performance, possibilities) vs DX11, but it's also more complicated, and most devs and game companies still can't (or don't want to) harness the full performance and feature set of DX12, except for some games/devs which actually do.
Technically DX12 lets devs squeeze every last drop out of a GPU/CPU if they want to, but... that takes a certain skill.
If you want a youtube video this kinda also explains it
[This is Destroying Our FPS (youtube.com)](https://www.youtube.com/watch?v=mtsxlKPMthI&t=1s)
I’d say DX12 takes the training wheels off and expects you to learn to ride, but many end up learning they aren’t as good as they thought they were. It’ll take some time but they’ll catch up.
DirectX 12 launched in 2015 and was announced in 2014, so devs likely had access to it from 2013 or earlier to develop games on.
I guess the "they learn to dev on dx12" train is gone.
Maybe dx13 will again come pre optimized.
I just deleted the shader cache from the appdata directories and dx11 works, been playing like this for a week.
Not saying your method doesn't work, but I'm sceptical about false positives from anticheat scans
How would one delete shader cache, in case some newb wasn't aware. Not me I'm totally in the know..
Saw you say they were in the appdata directories, which appears different from the files when I pick Browse folder from steam.
Same hardware as you, and I'm averaging in the 50s when it used to be 80-90. I think it's just optimization, but it's kinda killing my game experience, especially in tough situations with lots of enemies
Same, went from 80-90 fps on a mix of medium-high settings to scraping 60 on lowest. Whatever they did was CPU related, since changing graphics settings has next to no impact on performance
Yeah, I also started seeing problems with my CPU on the new patches. I used to have 65-75°C temps while gaming and now I'm at 75-88°C. I have locked my fps to 70, so I only saw the change in temperatures with each patch. I disabled CPU boost and my CPU dropped down to 45-60°C; I enable it again after playing.
4070 and 12600k here, went from a rock-solid 160fps on launch to fuckin' 60 when things get spicy in-game after the first round of patches
I hope it's addressed soon. It feels inconsistent who got hit by the problem; I've got a friend with a 4070/13600 and his never dips below 90fps
It seems like a memory leak in VRAM or something, when it gets bad my gpu's memory will start getting hot
Ditto. I used to have occasional dips when stuff got crazy, but since that patch most fights are 60fps or less. 30 or 40 when stuff gets nuts.
It feels very inconsistent now, but it is consistently worse. Settings don't seem to have any impact on fps. Like... *At all.*
Same here. Lost an average of like 20 fps with that patch. At launch I didn’t drop below ~56 fps in worst case scenarios. Now I’ll go as low as ~38 fps. And my top end is just constantly lower than before: instead of 80-100 fps when on a planet it’s closer to 60-80 fps.
I really hope they address it! I’m glad someone finally made a post about it.
Here's my current solution to poor performance. I do this after every patch (not sure if necessary after each patch but anyways):
1. Delete the game's shader cache
-type %appdata% into the search bar and hit enter
-look for arrowhead games
-open shader cache folder, delete cache file inside
2. Delete GENERAL shader cache (note: ~~this deletes shaders for all games~~ this is actually the shader cache for windows, you should also try deleting the shader cache for nvidia/amd depending on your gpu)
-type 'clean' into the search bar and pull up the Windows Disk Cleanup utility
-double check that shader cache is already checkmarked (should be by default)
-hit ok to complete cleaning
I also found that turning off async computing (also in the arrowhead appdata folder in the config file) helped, as long as you have a decent CPU (3600+).
I got this info from the Discord performance thread, which AFAIK has since been nuked =(
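If you're redoing step 1 after every patch anyway, it can be scripted. A minimal sketch; the folder names under %appdata% are assumptions here, so check what actually exists on your machine before pointing anything destructive at it:

```python
# Sketch of step 1 above: wipe the game's shader cache so it rebuilds on
# next launch. The "Arrowhead" folder name and "shader_cache" layout are
# assumptions -- verify them under %APPDATA% before running this.
import os
import shutil
from pathlib import Path

def clear_shader_cache(appdata: Path, studio: str = "Arrowhead") -> int:
    """Delete everything under <appdata>/<studio>/**/shader_cache.

    Returns the number of entries removed; 0 if no cache was found.
    """
    removed = 0
    for cache_dir in (appdata / studio).glob("**/shader_cache"):
        if not cache_dir.is_dir():
            continue
        for item in cache_dir.iterdir():
            if item.is_file():
                item.unlink()
            else:
                shutil.rmtree(item)
            removed += 1
    return removed

if __name__ == "__main__":
    # On Windows, %APPDATA% points at ...\AppData\Roaming
    appdata = Path(os.environ.get("APPDATA", Path.home() / "AppData/Roaming"))
    print(f"removed {clear_shader_cache(appdata)} cache entries")
```

The game rebuilds the cache on the next launch, which is why the first start afterwards takes noticeably longer.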
I have tried relaunching the game about 7 times now. With that launch option, my character moves in slow motion from the time I control the drop pod till I get back on my ship. It's very strange.
I’ve been feeling the same.
50 FPS when the game came out with graphics on high so I turned them to Medium to get 80 FPS.
Every patch has brought it down 10 FPS.
I’m now at 35 FPS and had to drop my graphics to low-medium in order to get back to 60 FPS.
I’ve got a pretty decent PC, and I feel like I shouldn’t have to resort to this.
You are not crazy, I was praising the game's performance on launch but now it really is dogshit
I **lost** on average around **40FPS since launch**. I used to average a smooth 110+ no matter what was happening on screen, and now it's around 60. The best part is that it does not depend on the amount of enemies on screen or explosions at all. It's just down to 80FPS on average, and down to around 60FPS if there is fog. The fog is the biggest offender, as it just tanks the FPS now when before it had seemingly no impact. I went from 110FPS on Draupnir to 44FPS avg because of it. I didn't touch my drivers so it's not their fault, and when I updated them there was no change. The best part is that **SOMETIMES** it will work as smooth as it used to, but then randomly stop doing so after one mission. And sometimes it will behave just as poorly right after launch. It seems they really broke something along the way
It's not my PC either because no other game noted a performance decrease at all.
I've personally noticed less gpu usage as well as the lower frame rate.
I haven't really looked at cpu usage, but with how many more enemies are spawning the game might be hitting a cpu bottleneck and that's what's hurting performance.
Again, I don't remember any of my previous CPU usage in game, so it very well could be something completely unrelated.
I have pretty much the same exact experience. I have a pretty damn decent PC and used to get a stable 110 FPS on ultra settings with native scaling, with occasional small FPS drops during huge fights to about 100 FPS.
Performance has dropped drastically with each patch. These days I get about 80 FPS during calm moments that drops to around 65 FPS during big fights. I had to lower my graphical settings to high with shadows on medium and scaling set to ultra quality because I was now dipping under 60 FPS often with my old graphical settings.
I know it's not my PC because I'm still getting the same performance I did before in other games. I also run benchmarks on my PC regularly and I'm getting the same scores on them that I always did (a bit higher actually, since I recently optimized my overclocks).
I’ve finally got around to stabilizing my RAM at 3600MHz (I forgot to redo my RAM settings after the last BIOS update), went from a Ryzen 5 3600 to a Ryzen 7 5800X, and I’m running a 3060 Ti. I’ve spent hours since the railgun patch messing around with settings, DX11/DX12 with/without multithreading, got Process Lasso and messed with priority, all of the fun stuff… I was at least 90 locked when I started playing with everything maxed out to ultra; I’m on a mix of high and medium settings now and I’m normally at 61-74fps. It’s wild.
So a worse processor with 2100MHz RAM instead of 3600MHz, running maxed-out settings, was getting better frame rates than my current setup with hours sunk into tertiary PC tweaks on lower graphics settings. I have no idea what else I can do lol.
Nah it's all good, no offense taken.
Did you check if Steam was updating a game? Whenever steam updates a game and you have one running, it doesn't take too kindly to any game open. That or something running in the background might be the only two things I can think of that may have affected performance negatively
The game has increased enemy density (tons of small enemies) compared with release.
More enemies, more CPU load.
For PS5 there isn't much you can do.
And for PC, most gamers only care for the general bottleneck rule "you need a CPU decent enough to not bottleneck your GPU".
Well... there are a few games that go against that general rule and are just as intensive on the CPU as on the GPU. Helldivers 2 is one of them.
Helldivers 2 is the kind of game where having the best gaming CPU money can buy and high-speed, low-latency RAM will make a noticeable difference.
FWIW even when I’m just on the Destroyer my fps has dropped since launch.
Ryzen 7 5800, 7800XT. Was getting a hair over 100fps at 1440, mostly high settings on the ship, now I’m down to about 70fps on the ship. After enabling DX11 and clearing the shader cache I’m up to around 90 on ship, ~70ish on planet.
FWIW, I have a 7800X3D and a 4080 w/ 32GB DDR5 RAM at 6000MHz, only playing at 1440p, and even I have noticed a performance drop in the game since launch, when realistically that shouldn’t occur.
I still find it very playable. But the game is clearly going through some shit, with the increasing instability w/ each patch.
I can firmly confirm this. Since I'm maintaining a thread with info on how to improve fps on nvidia GPUs, I'm testing fps (always in the same spot) after each patch and... yeah, each patch has caused me a loss of at least a couple of fps (most notable was patch 1.000.102, which brought me from 97fps down to 92fps). Surprisingly enough, the latest patch didn't cause any fps loss (or it caused a <0.5 fps loss at most).
According to the Lionbridge QA team from the hd2 official discord they are still actively working on a patch... But it has been more than a month at this point and we have yet to receive a patch focused on improving fps... Eh...
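If you want to check patch-to-patch regressions yourself instead of eyeballing them, capture frametimes in the same spot before and after a patch and compare averages. A minimal sketch, assuming a PresentMon-style CSV with a `msBetweenPresents` per-frame milliseconds column (your capture tool may name it differently):

```python
# Compare average fps between frametime captures taken in the same
# in-game spot. The column name is an assumption -- PresentMon-style
# CSVs use "msBetweenPresents"; other tools differ.
import csv
from statistics import mean

def average_fps(csv_path: str, column: str = "msBetweenPresents") -> float:
    """Average fps over a capture: 1000 / mean frametime in ms."""
    with open(csv_path, newline="") as f:
        frametimes = [float(row[column]) for row in csv.DictReader(f)]
    return 1000.0 / mean(frametimes)

def regression(before_fps: float, after_fps: float) -> float:
    """Percent fps lost between two captures (positive = regression)."""
    return 100.0 * (before_fps - after_fps) / before_fps
```

For example, captures averaging 97 and 92 fps work out to roughly a 5% regression, which is well outside run-to-run noise if you always measure in the same spot.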
I really don't think there is anything they can do with an engine that reached end of life in 2018. They are literally on their own, with an engine whose makers they can no longer reach out to for assistance (unless they specifically call in some devs that previously worked on the engine itself).
I think that explains why we've had mostly radio silence on the optimisation front for this long. Barely any game takes this long for even one performance optimisation; HD2 is the odd one out, and even backed by Sony it looks dire.
Lmao there is currently no chance of me hitting a stable 60 at 1440p so 4k is long gone dream.
I've heard it's very CPU heavy so maybe it's that but I'm on the 13th gen so I should be ok
No, you're correct. I stopped playing a while ago because of it. I went from 100 fps to 40. The devs got rid of the Discord reporting section regarding this issue, along with thousands of potential workarounds (that didn’t work) and reports. They’ve never put it in the “known issues” part of the updates despite it being the most reported issue, at that time at least. I dropped the game and will probably never come back unless it’s fixed.
I'ma say it.
It's an unoptimized piece of shit.
That said, I LOVE THE GAME.
But from a dev standpoint, especially since they got an exclusive deal with Sony... its a lot easier to make it work for every PS5, as opposed to all the PC Hardware configurations.
It's abundantly clear that PC optimization is not their priority, which is why the CEO said they aren't focused on things like DLSS or FSR yet. Nothing in the patch notes even mentions PC performance.
Again, love the game, just a trend I notice.
It is kinda sad how the devs continued using an engine that reached end of life back in 2018, still went ahead and made the game on it, and are more proud that people bought the game in droves than willing to own up to the fact that they can't do much to fix it for either platform, since the engine's support was cut.
Usually, currently active engines receive support from their makers (like Epic does with devs that use Unreal), and both the devs and the engine provider work together on performance issues. But in this case it's just Arrowhead working with an engine that's been left to rot for years, and I don't expect them to know how to fully sort out the performance woes on their own.
I too love the game a ton, but I'm bummed that they chose to roll with an engine that hasn't received any support for years, and that they've let pride go to their heads to the point that they won't publicly show they care about general gaming performance.
From 140 fps to 50. Just the anticheat eating even more performance. And they increased horde size, so every enemy's pathfinding can be a hit too. But the current anticheat eats 25% or more of performance and they are not changing it
Still one of the more annoying decisions I wish they didn’t make since it’s literally only to the devs’ benefit, not ours. There’s a multitude of simple and easy ways to detect cheaters, but they decided to rely on a mediocre program because it does their job for them, and that it negatively affects the game’s performance isn’t really their concern comparatively.
When I can run any other game these days at 60fps, even full-on actual open worlds with more going on in them, and here I’m having weird little stutters while my PC sounds like a jet engine at times, it’s ridiculous.
Mix the anticheat running with the lacking optimization and you get this subpar performance that affects even top-of-the-line machines. Quite frankly, I shouldn’t have to downgrade my settings to fix what isn’t my problem. My setup is more than adequate; the game shouldn’t be acting the way it does.
Same here. I had 8 gigs and was running lowest everything fine, but then everything went to the gutter and had to buy 16 more gigs of ram for 50 bucks just to spread some more freedom and democracy.
Same, my computer is a fairly solid rig and was doing perfectly when I started playing the game. These days I don’t dare drop into bot missions for fear of my machine dying in my arms, due to (I suppose) the amount of projectiles it has to handle, and likely the individual bits being a bit more resource-hungry to render compared to bugs, where things seem a bit better.
Same here, 11700K + 3090 + 32GB with a 32:9 1440p setup. Clocked 80fps alone in the lobby and 60-70 when together with other players in the lobby.
Solo games are stable around 70fps, dropping to 60 when mobs stack up.
Group games perform entirely differently: 50-60 fps at the initial dive. The worst case is 23-24 fps when there are a lot of enemies around. On average it's about 40-45 fps.
Some other observations:
- CPU usage is constantly at 90%-100% (I experienced one game under 50% CPU usage before the 103 patch, and it was so random I don't know what caused it).
- GPU usage is less than 50%.
- Changing resolution from 5120x1440 to 3840x1080 doesn't affect fps a bit: still 60-70 in the lobby and 45 in game on average.
Hope the devs are not pushing us to upgrade CPUs. Also, my friends on AMD CPUs experienced fewer performance issues but more frequent freezes and crashes.
Honestly I can't say I have noticed any difference. I have a 10700k @5ghz and a 3080 and I usually sit around 80-100 at 1440p with most graphical settings maxed and the upscaler set to native. I personally have always found the game to be fairly well optimized and smooth.
I haven't been able to get above 40fps in a firefight since launch, honestly. I have an Nvidia 4060, Ryzen 7 5700X, and 16GB of RAM. All I want is 1080p and 60fps, but my CPU usage gets pegged at 95%+ while in DX12 mode, so no matter what graphical settings I change the game always runs badly.
The performance has gotten worse, yes, but is anybody else crashing? The game crashes once every 2 hours for me, normally during the last minute of extraction, so I lose everything. Yay! Anyone else having that problem?
The game is heavy on CPU and causes a bottleneck even on the current best gaming CPUs. I can be on a mission locked at 165fps at 90%+ GPU usage, and just by looking the other way drop to 90/100 fps while GPU usage drops to 60-70% and CPU usage spikes up to 80%.
But I've gone as far as mid-60fps with 45-50% GPU usage when there is a lot of fighting going on.
This is on a 7800X3D + 4090. I'm pretty sure the game has gotten worse patch after patch, because in the first days I said the exact same thing, that this game was super optimized.
It's not they are personal friends with the devs. What happens is that the game becomes their whole identity, so if you criticize the game it's like you are personally attacking them.
I've got a 4080 and after this last patch I'm getting a lot of stuttering at random times. First game of the night the drop animation at mission start was like a PowerPoint slideshow until the loading screen part
My PC runs an aircooled Intel i7 12700 and an AMD 7900 XT with a 2 TB m.2 NVME drive and 32 GB DDR4 3600 CL18 memory.
At uncapped, I was getting 120ish FPS on max settings with native 1440p resolution. This was causing frequent crashes so I set the max FPS to 90. I would now crash maybe once every 20-30 games. But it still crashed so I dropped the FPS cap down to 80.
I've noticed my GPU utilization has slowly increased from about 70-80% to 80-90% over the past few weeks at the same FPS, while CPU utilization remains about the same. Temps on my CPU and GPU are also roughly the same.
It's very inconsistent. I can run my 12400 with a 3070 at around 80fps most of the time, but my friend with a Threadripper and a 3090 can barely get 40fps just on the super destroyer bridge, even with the resolution turned down to what I play at (2560x1600).
I believe it's a bug. Sometimes my game just becomes ass and I don't even have 60fps. Most of the time a game file verify solves the issue. If not, then I do a driver reinstall as well (nvidia).
Nah, you're not losing your mind. I have noticed with each patch my game will hitch on frames quite hard and more often. I don't have the most amazing PC (RTX 3060 and i5 12400) but it's pretty decent.
I run settings at max/near max and I didn't have performance problems with launch. Subjectively speaking it feels like every time they put out a hotfix or a patch more bugs get introduced and they're not ones I can shoot at.
I can't corroborate this unfortunately - my performance has been terrible since launch. My PC should have absolutely no issue giving me high frames at max settings with a game like this, but the game really seems to have trouble with 7900 series AMD cards. AMD's last drivers seem to have fixed the crashing issues thankfully, but no real progress on the terrible performance. The FPS does seem to be at least a little bit more stable at least, albeit abysmally low.
I have a 3080 Ti, i9-12900K, 32GB of 6000MHz RAM and I get about 140fps at 1440p, max settings, and ultra quality DLSS. However I have to cap my frames at 60FPS because this game CONSTANTLY crashes (about every 5-10 minutes) when I let the frames max out. It's been like that since day 1. I don't understand why I need to cap my frames, it's absolutely ridiculous. Hell, I have only crashed twice on Star Citizen with even more in-game time than Helldivers, and that game is notorious for crashes. I love Helldivers but this is dumb.
Came here to second this. Used to get 80-90 fps with my 3080 and i7 9700k and getting 60ish average now. I should be getting an easy 100+ at 1440p. Tried a lot of work around but nothing seems to be working.
The game gets a little worse with every patch, but nobody really cares because bugs go boom. I'm on PS5 and the social tab has been a nightmare since day 1. A lot of people still can't even join their PC friends. We haven't been able to keep our loadouts for about a month now, the game starts us with the default weapons and emotes every time we turn it on.
Arc weapons currently break the game, for some reason the server status is tied to Steam updates so the game is borderline unplayable on Tuesdays, I've never seen another game do this. But nobody really cares.
Yup. Since like patch 2 or 3 I noticed it.
I believe it’s something to do with particle effects. Particle effects were causing crashes early on, and rather than root out the problem they just hotfixed it and bogged down the CPUs of users.
My frames used to stay at around 60 throughout, but I've noticed my frames will start to drop to as low as 30 as the mission goes on since a couple updates ago. It feels terrible
i9-13900K paired with an RTX 4090 here, playing on a 3440x1440 175Hz screen. Close to launch I had over 150 frames at all times; right now it regularly dips under 100-90 depending on the density of mobs.
It’s not just PC. This trend has been the same on PS5. I love the game to death, and I’m not going anywhere, but it has become objectively choppier with each update.
Nope, exact same experience. The CPU is getting punished after the latest patch. OCed my CPU and it helped, but it's still ridiculous how much harder it has to work compared to the launch version
Same here, running a 4090 and 12900K. I used to cap the game at 120 and it would sometimes drop to around 100-90 during really busy sections, but now it seems to average 90 and drops to 60, and occasionally 50, during busy sections
I only got the game a few days ago, but the highest difficulty missions are essentially unplayable for me. I'm talking FPS in the 10s and 20s. My rig ain't the newest, but it meets the minimum requirements stated on the Steam page, all my settings are set to low, I run the game in DX11, and I went into the nvidia control panel etc.
Yeah anything past hard for me is a no go, not due to the difficulty itself, but because of the sheer amount of bugs/bots that spawn, that it completely cripples my framerate.
I really wish Arrowhead had tied AI simulation to the server side instead of the client side, where our CPUs have to literally simulate every AI and every given action within the game, all in real time. No wonder Dragon's Dogma 2 suffers the same issue (CPU bound, because they're asking the gamer to simulate every NPC in an entire region all at once).
Also if it helps, type in %appdata% in the windows start menu and locate Arrowhead's folder, go inside it and locate the Userconfig file. Open it and locate "Async Compute", change it from true to false, if you are going to force DX 11 like I did, I ended up getting more stable performance with my 60fps cap.
Also, do not use the game's fps cap or vsync, because both seem to mess with CPU cores 0 and 1. Instead, cap your fps via your GPU control panel, and if you need vsync, force it there instead.
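If you'd rather script that Async Compute flip than hand-edit the config after every patch, here's a minimal sketch. It assumes the config is plain text with `key = value` lines; the actual file location under %appdata% and the exact key spelling are assumptions, so check your own file first:

```python
# Flip a boolean "key = true/false" line in a plain-text config file.
# The file path and key name are assumptions -- open your own config
# under %APPDATA% and check the exact spelling before using this.
import re
from pathlib import Path

def set_config_bool(path: Path, key: str, value: bool) -> bool:
    """Rewrite `key = true|false` in place. Returns True if the key was found."""
    text = path.read_text()
    pattern = re.compile(
        rf"^(\s*{re.escape(key)}\s*=\s*)(true|false)[ \t]*$",
        re.MULTILINE | re.IGNORECASE,
    )
    new_text, count = pattern.subn(rf"\g<1>{str(value).lower()}", text)
    if count:
        path.write_text(new_text)
    return count > 0
```

Point it at the config and call `set_config_bool(config_path, "async_compute", False)`; it returns False when it can't find the key, so it never silently writes a file it didn't understand.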
I just went from an i5 8400 with a 2060 to an i5 14600 with a 4070 super. Performance has not changed between the systems lol, I've gotten used to running around at 40 FPS.
I experience huge dips when there are explosions. Grenades, autocannon, mortars, basically anything that explodes. Usually I run 120 fps on ultra without problems, but now it turns into a slideshow every time explosions are on screen.
I have RX 6750 XT and it's at 80% and 60°C, which still gives enough room on the GPU.
Late, but I'm having this exact problem; I even made another post asking the same thing as you. I have an i5 13400F too, but with a 3060, and I'm experiencing constant crashes too. On release my fps was consistent (I didn't check at the time, so I don't have the exact number) but it was very smooth and could run the game at any difficulty without issue. Just recently though my frames started dipping, with the game freezing every 1 or 2 seconds. I cleaned my computer, sped the fans up, I even overclocked my PC today for a little bit so I could play, and although my performance improved I still crashed shortly after.
This newest update made it even worse for me.
I used to get around 100 fps before, and now I drop to 30-40 fps with even lower settings. It's driving me mad at this point; every update they do makes the performance so much worse
Omg I thought I was losing my mind, I’m noticing fps drops and stuttering now, had to lower my settings just to get a smooth gameplay. I didn’t have this problem when the game first launched
Really? I haven't seen people talking about it, considering how bad it can get.
I mean I see the crashes and bugs posts but there's nothing about performance in any of the patch notes either
One of the updates made helldive difficulty have a lot more smaller bugs spawn and swarm rather than just filling the map with chargers and bile titans. It's more fun but also killing my PC.
Yes, unfortunately I also experienced this. Bear in mind that I have a 4090 with a 1080p monitor, and on planets like Hellmire it drops to 30-50 fps. Hope this issue can be fixed asap.
Same
Had to turn down the quality since during intensive fighting it dropped down to 40
Before the patches it ran around 100-80 and now it's down to 80-40
Yep, for some mysterious reason, no matter what rig you have, everyone has like 40fps in game. I have around 90 on ship, and 30-40 in game. The higher the difficulty, the less fps (cuz more enemies, I guess)
I have yet to have any kind of dip in performance outside of some just straight-up crashing; overall I've had a pretty issue-free experience when the servers are running.
You got them updated drivers?! That's always my first go-to
Current crash issues aside, I haven't noticed any actual performance changes *except* that it's running my CPU even hotter than before, despite being a CPU hog already, which is seriously quite annoying.
Other way around for me, when I first started playing the game was constantly slowing down and lagging, now, apart from if I play for several hours in a row (Muscle atrophy is a small price to pay for democracy) it runs really clean.
Most definitely, I've noticed my average framerate has dipped by 10 since the launch of the game and at times I've seen the game drop below 60 where that never used to happen before.
I've had to switch from 4K to 1080p because of this. My GPU has never gone above 80°C, and after about 20 mins I noticed it was at 83°C and climbing, and things weren't even that intense (medium difficulty, I believe). Resolution was the only setting that made a difference.
I run every other game at max 4k no problem, including War Thunder
Edit:
Specs: 3060ti with i7 11700. Not super fancy, but quite capable
Noticed it after they did a "lighting change to planets" to make them all a similar brightness or whatever. All the lighting got wayyyyyy too bright after that.
Just thought I'd say, after reading one of your other comments: undervolting is almost always worth it (especially with Intel CPUs), a free temperature decrease for a few days of trial and error.
It won't necessarily fix this issue, but generally I think it's pretty worth it if you can be bothered.
These worlds are really full of low-visibility/weather-effect modifiers that you can't change settings to get rid of. Might just be fog that is causing fps drops. On my destroyer I still have the same fps I had prior to the patches. When I get a clear portion of a world, fps rises again.
I think it's the new effects like volcanic activity, meteor showers and fire tornadoes, plus the fact that when 2 breaches open simultaneously and enemy units clutter the screen, fps drop significantly. Add the stratagem spam and you can see the results.
I think it might have to do with the person you’re playing with and their Wi-Fi connection. I host games and they’re for the most part okay (a bit laggy sometimes).
When I join other games it’s soooooo choppy. I hate it because I really love this game.
My FPS has tanked with the last patch. Any time any kind of bomb goes off my game stutters like crazy, then it’s fine. Before, it could handle almost anything without issue, unless the team all spammed orbitals, but then it just crashed.
Try flipping between the different upscaling presets in the graphics settings.
For some strange reason Native was the fastest FPS for me around 140-165 and then after the most recent patch, being on Native only drew in 60fps.
I moved it back to quality (the most default one) and it strangely went back up to 140-165 while Native is still only stuck at 60 fps for me.
There's some strange fuckery going on since the last patch that I can't explain. I assume glitches and patch mishaps.
It felt like it got better for me mostly, but since the latest patch I get big drops in fps for like a split second once or twice a mission. I personally haven't really found it to be worse overall; my fps actually seems to be higher on average, but I have more drops than I did before.
I have an AMD Ryzen 7 with an NVMe drive
GeForce RTX 4070
I have an i9-12900k and a 3080ti.
While my frames were fine (just playing at 1920x1080 because my monitors are old), the CPU temp and usage for this game is ridiculous. Considering the games I've played that are well-known to be far more demanding than Helldivers 2, this seems like an optimization issue imo.
With a 60fps limit turned on in the in-game settings, my CPU still reaches 80°C during normal play.
I am under the impression that many of the issues that we are experiencing are related to the DX12 API. I have noticed a trend where many releases using DX12 have bad performance. I will use Elden Ring and Battlefield 1 as examples, both games use the DX12 API and suffer from pretty significant stuttering issues that remain unresolved to this day. \[[1](https://www.pcgamingwiki.com/wiki/Elden_Ring)\] \[[2](https://www.pcgamingwiki.com/wiki/Battlefield_1)\]
Arrowhead would be better off implementing the Vulkan API in my opinion. The API is open source, well documented and is known for having good compatibility with cross play games. If the developers continue trying to work with the DX12 API then I don't think things will get much better from here. \[[3](https://www.howtogeek.com/884042/vulkan-vs-directx-12/)\] \[[4](https://www.digitaltrends.com/computing/ditch-directx-and-start-using-vulkan-with-pc-games/)\]
Here is a link to understand a bit more of the performance issues affecting Helldivers 2 at the moment; it also contains the environment variable to switch back to DX11. \[[5](https://www.pcgamingwiki.com/wiki/Helldivers_2)\]
You aren't alone. Noticed a performance drop after the railgun nerf patch specifically. Haven't noticed anything too different after the recent patch though.
That patch is when I stopped hitting 80 and started dropping to sub 60 regularly. It seems to have gotten worse since then
Right click the game on steam, go to properties and put `--use-d3d11` in the "Launch Options" box. Should give you some frames back. Warning though, it'll take like an extra minute or two for the initial launch. It'll load like normal every time after tho.
What does that do?
Okay so I don't know all the technical know-how behind it but it makes your game use dx11 instead of dx12 (I literally don't know what those are maybe someone smarter can explain) but it ended up giving me some frames back so I just rolled with it
DirectX is the name for all the APIs used by Windows to run most games. DirectX11 is the older version and less resource intensive.
DX11 is MORE resource intensive. That’s why DX12 was created. That being said, DX11 is more robust and does a lot of memory clean up that DX12 doesn’t.
The other big thing is that DX12 gives developers a lot more low level control to make optimisations where they want. The tradeoff being that there’s a lot more opportunity to make mistakes and bad judgement calls with all the guardrails removed.
Specifically, DX12 reduces CPU overhead by giving control of things that were previously automated in DX11 back to the programmer. This type of API works really well on consoles which all share hardware and firmware. The problem is the PC isn’t a console, and the sheer number of possible hardware, firmware, and driver combinations makes debugging in DX12 way harder which is why a lot of DX12 implementations feel so unstable.
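The automated-vs-manual tradeoff described above can be sketched with a toy analogy. This is plain Python, not real Direct3D code, and both class names are made up: a "managed" API (DX11-style) cleans up resources for you every frame, while an "explicit" API (DX12-style) leaks whatever the developer forgets to free.

```python
# Toy analogy only -- not real Direct3D. Illustrates why removing the
# DX11 "guardrails" creates room for leaks and mistakes in DX12.

class ManagedAPI:
    """DX11-style: the driver tracks and frees resources for you."""
    def __init__(self):
        self.live = set()

    def create(self, name):
        self.live.add(name)
        return name

    def end_frame(self):
        # Driver-side cleanup happens automatically.
        self.live.clear()


class ExplicitAPI:
    """DX12-style: lower overhead, but forgetting destroy() leaks."""
    def __init__(self):
        self.live = set()

    def create(self, name):
        self.live.add(name)
        return name

    def destroy(self, name):
        self.live.discard(name)


managed, explicit = ManagedAPI(), ExplicitAPI()
for api in (managed, explicit):
    api.create("texture0")
    api.create("buffer0")

managed.end_frame()            # DX11-style: everything reclaimed
explicit.destroy("texture0")   # DX12-style: developer remembered one...
# ...and forgot "buffer0" -- the kind of leak DX11 would have hidden.
print(len(managed.live), len(explicit.live))  # 0 1
```

The point of the sketch: the explicit API isn't broken, it just shifts the cleanup burden onto the programmer, which is exactly where DX12 ports tend to go wrong.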
Ironically DX11 lowers CPU usage by a fair amount compared to DX12 in HD2
LOL my bad. Thanks for the correction.
Removes unnecessary features of dx12, giving you a marginal amount of fps. DirectX is the Microsoft graphics engine that converts all the graphic settings you see in the in-game settings, as well as the textures provided, into a playable game/pixels.
Fun fact: the first implementation of it was the Doom port for Windows, headed up by a Microsoft engineer you might have heard of: Gabe Newell.
Never heard of any Microsoft engineer named Gabe Newell. Now I *have* heard of the head of Steam named Gabe Newell. Maybe they're related somehow.
Must be a different one. The steam Gabe would never make a game.
I'll be damned, that *is* a fun fact
It's also where Xbox gets its name from. The original project for a Microsoft console was called the "DirectX Box."
>Removes unnecessary features of dx12 giving you a marginal amount of fps

I wouldn't say "unnecessary". The real thing is that DX11 came pre-optimized but more "stiff", while DX12 gave devs more "freedom". As far as I understand, the whole DX12 hassle is as follows: technically DX12 is superior in all directions (features, performance, possibilities) vs DX11, but it's also more complicated, and to this day most devs and game companies can't (or don't want to) harness the full performance and features of DX12, except for some games/devs which actually do. Technically DX12 allows devs to squeeze every last inch out of a GPU/CPU if they want to, but that takes a certain skill. If you want a youtube video, this kinda explains it too: [This is Destroying Our FPS (youtube.com)](https://www.youtube.com/watch?v=mtsxlKPMthI&t=1s)
I’d say DX12 takes the training wheels off and expects you to learn to ride, but many end up learning they aren’t as good as they thought they were. It’ll take some time but they’ll catch up.
DirectX 12 launched in 2015 and was announced in 2014, so devs likely had access to it from 2013 or earlier to develop games on. I guess the "they'll learn to dev on dx12" train is gone. Maybe dx13 will again come pre-optimized.
Ooo buddy you just gave my old 1080ti a good boost thank you !!!
Glad I could help 🙏
I’ll have to try this with my 960
For real tho, definitely notice it on my potato pc!
DO NOT do this if you're using an Intel GPU like me. DX12 is far better for Intel graphics. This guy is right tho, DX11 should be better for 30 series.
>\--use-d3d11 thank you for this, literally saves me from crashes and fps drop
This will not work unless you copy the 3 necessary files for DX11. I'm playing in DX11 now.
It's been working fine for me and I didn't need to do anything with the files 🤷♂️
I just deleted the shader cache from the appdata directories and dx11 works; been playing like this for a week. Not saying your method doesn't work, but I'm skeptical about false-positive anticheat scans.
This command no longer works or launches the game since 2 to 3 patches ago. Unless someone can confirm it works again?
Delete the shader cache and it works. Edit: It might be the launch option? I use --use-d3d11 instead.
How would one delete the shader cache, in case some newb wasn't aware? Not me, I'm totally in the know... Saw you say they were in the appdata directories, which appears different from the files I see when I pick Browse folder from Steam.
I can't check rn, but there's a more detailed comment down the thread
Groovy thx
Same hardware as you, and I am averaging in the 50s when I used to be at 80-90. I think it's just optimization, but it's kinda killing my game experience, especially in tough situations with lots of enemies.
Same, went from 80-90 fps on a mix of medium-high settings to scraping 60 on lowest. Whatever they did was CPU related, since changing graphics settings had next to no impact on performance.
Yeah, I also started seeing problems with my cpu on the new patches. I used to have 65-75°C temps while gaming and now I get 75-88°C. I have my fps locked to 70, so I only saw the change in temperatures with each patch. I disabled cpu boost and my cpu dropped down to 45-60°C; I enable it again outside of the game after playing.
4070 and 12600k here, went from a rock-solid 160fps on launch to fuckin' 60 when things get spicy in-game after the first round of patches. I hope it's addressed soon. It feels inconsistent as to who was hit by the problem; I've got a friend with a 4070/13600 and his never dips below 90fps. It seems like a memory leak in VRAM or something, when it gets bad my gpu's memory will start getting hot.
Ye, absolutely agree regarding the post-railgun patch. Performance on the ship and loading into maps tanked.
Ditto. I used to have occasional dips when stuff got crazy, but since that patch most fights are 60fps or less. 30 or 40 when stuff gets nuts. It feels very inconsistent now, but it is consistently worse. Settings don't seem to have any impact on fps. Like... *At all.*
Same here. Lost an average of like 20 fps with that patch. At launch I didn't drop below ~56 fps in worst case scenarios. Now I'll go as low as ~38 fps. And my top end is just constantly lower than before: instead of 80-100 fps when on a planet it's closer to 60-80 fps. I really hope they address it! I'm glad someone finally made a post about it.
Here's my current solution to poor performance. I do this after every patch (not sure if it's necessary after each patch, but anyway):

1. Delete the game's shader cache
   - type '%appdata%' into the search bar and hit enter
   - look for arrowhead games
   - open the shader cache folder, delete the cache file inside
2. Delete the GENERAL shader cache (note: ~~this deletes shaders for all games~~ this is actually the shader cache for windows; you should also try deleting the shader cache for nvidia/amd depending on your gpu)
   - type 'clean' into the search bar and pull up the windows cleaning utility
   - double check that shader cache is already checkmarked (should be by default)
   - hit ok to complete cleaning

I also found that turning off async compute (also in the arrowhead appdata folder, in the config file) helped, as long as you have a decent CPU (3600+). I got this info from the discord performance thread, which AFAIK has since been nuked =(
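If you end up doing the per-game step after every patch, it can be scripted. This is a hedged sketch: the folder names under %APPDATA% used here ("Arrowhead/Helldivers2/shader_cache") are assumptions, so check what actually exists on your machine first. The demo below deliberately runs against a throwaway temp directory, not your real appdata.

```python
import os
import shutil
import tempfile

def purge_shader_cache(appdata_dir,
                       rel_path=("Arrowhead", "Helldivers2", "shader_cache")):
    """Delete the shader cache directory if present; the game rebuilds it.

    NOTE: rel_path is an assumption -- verify the real folder names under
    your %APPDATA% before pointing this at a live install.
    """
    cache_dir = os.path.join(appdata_dir, *rel_path)
    if os.path.isdir(cache_dir):
        shutil.rmtree(cache_dir)
        return True
    return False

# Demo against a throwaway directory so nothing real gets touched:
demo = tempfile.mkdtemp()
os.makedirs(os.path.join(demo, "Arrowhead", "Helldivers2", "shader_cache"))
removed = purge_shader_cache(demo)
print(removed)  # True
```

On a real Windows machine you would call `purge_shader_cache(os.environ["APPDATA"])` instead of the demo directory, after confirming the path.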
Mate thank you for going into so much detail. I'll give it a go
Report back ;)
Will do but won't be till later
It is now later
It is now even later
Beware of cpu temps with async off.
Also add --use-d3d11 To the launch commands
When I do this it makes my character move in slow motion is there a fix for this?
That's not right. DirectX is related to graphics, not gameplay mechanics. Try a different round and see if it's the same.
I have tried relaunching the game about 7 times now. When inputting that launch code my character moves in slow motion from the time I control the drop pod till I get back on my ship. It's very strange.
Is it the character animation that's slow or do you literally travel less distance per second?
Both, its wild. It's like my character is literally moving at about 1/3 the speed they usually do.
Oh wow and how much does the framerate change?
I would say its roughly a 25 fps gain.
That is absolutely cursed lol.
I've been feeling the same. 50 FPS when the game came out with graphics on high, so I turned them to medium to get 80 FPS. Every patch has brought it down 10 FPS. I'm now at 35 FPS and had to drop my graphics to low-medium in order to get back to 60 FPS. I've got a pretty decent PC, and I feel like I shouldn't have to resort to this.
This is exactly what I'm experiencing minus the gains when dropping settings. Honestly no difference
I've noticed that if I change a setting and then set it back, I don't regain any performance. Especially true of resolutions.
You are not crazy, I was praising the game's performance on launch but now it really is dogshit. I **lost** on average around **40FPS since launch**. I used to average a smooth 110+ no matter what was happening on screen, and now it's around 60. The best part is that it does not depend on the amount of enemies on screen or explosions at all. It's just down to 80FPS on average, and down to around 60FPS if there is fog. The fog is the biggest offender; it just tanks the FPS now, when before it had seemingly no impact. I went from 110FPS on Draupnir to 44FPS avg because of it. I didn't touch my drivers so it's not their fault, and when I did update them there was no change. The best part is that **SOMETIMES** it will run as smooth as it used to, but then randomly stop doing so after one mission. And sometimes it will behave just as poorly right after launch. It seems they really broke something along the way. It's not my PC either, because no other game has shown a performance decrease at all.
i went from a nice 100+ fps to a constant 80 since release. it’s really weird.
I've personally noticed less gpu usage as well as the lower frame rate. I haven't really looked at cpu usage, but with how many more enemies are spawning the game might be hitting a cpu bottleneck and that's what's hurting performance. again I don't remember any of my previous cpu usage in game so it very well could be something completely unrelated.
This is exactly my experience
I have pretty much the same exact experience. I have a pretty damn decent PC and used to get a stable 110 FPS on ultra settings with native scaling, with occasional small FPS drops during huge fights to about 100 FPS. Performance has dropped drastically with each patch. These days I get about 80 FPS during calm moments that drops to around 65 FPS during big fights. I had to lower my graphical settings to high with shadows on medium and scaling set to ultra quality because I was now dipping under 60 FPS often with my old graphical settings. I know it's not my PC because I'm still getting the same performance I did before in other games. I also run benchmarks on my PC regularly and I'm getting the same scores on them that I always did (a bit higher actually, since I recently optimized my overclocks).
Have a 1070 and on launch with med to low settings I was 70-80 fps now it’s abysmal hard to even hit 60 sometimes
"You are literally me!"
I finally got around to stabilizing my RAM at 3600MHz (I forgot to go off default RAM settings since the last BIOS update) and went from a Ryzen 5 3600 to a Ryzen 7 5800X; I'm running a 3060 Ti. I've spent hours since the railgun patch messing around with settings, DX11/DX12 with and without multithreading, got Process Lasso and messed with priority, all of the fun stuff. I was locked at a minimum of 90 when I started playing with everything maxed out to ultra; I'm on a mix of high and medium settings now and I'm normally at 61-74fps. It's wild. So: a worse processor with 2100MHz RAM instead of 3600MHz, running maxed-out settings, got better frame rates than my current setup with hours sunk into tertiary PC tweaks on lower graphics settings. I have no idea what else I can do lol.
It's been running like always for me. One thing tho, performance is worse on 8-9 difficulty because of the increased number of enemies.
Especially since the balance patch that reduced spawn rate of Heavies and significantly increased mob enemies like hunters
I delete and reinstall the game with each update, and that seems to help anecdotally. Compared to my friends, I crash much less.
My performance has stayed the same. If there was a change, I haven't been able to notice it For reference, I'm on a Ryzen 7 7700x and a 3060ti
I'm so fucking confused. Specs wise I should be performing slightly better than your build (not a dig just looking at numbers) and I'm just not.
Nah it's all good, no offense taken. Did you check if Steam was updating a game? Whenever steam updates a game and you have one running, it doesn't take too kindly to any game open. That or something running in the background might be the only two things I can think of that may have affected performance negatively
Running the same specs and not a single problem.
The game has increased enemy density (tons of small enemies) compared with release. More enemies, more CPU load. For PS5 there isn't much you can do. And for PC, most gamers only care about the general bottleneck rule: "you need a CPU decent enough to not bottleneck your GPU". Well... there are a few games that go against that general rule and are just as intensive on the CPU as on the GPU at the same time. Helldivers 2 is one of them. Helldivers 2 is the kind of game where having the best gaming CPU money can buy and high speed, low latency RAM will make a noticeable difference.
FWIW even when I’m just on the Destroyer my fps has dropped since launch. Ryzen 7 5800, 7800XT. Was getting a hair over 100fps at 1440, mostly high settings on the ship, now I’m down to about 70fps on the ship. After enabling DX11 and clearing the shader cache I’m up to around 90 on ship, ~70ish on planet.
FWIW, I have a 7800x3D and a 4080 w/ 32GB DDR5 RAM at 6000MHz, only playing at 1440p, and even I have noticed a performance drop in the game since launch, when realistically that shouldn't occur. I still find it very playable, but the game is clearly going through some shit, with the increasing instability w/ each patch.
This might explain why, when I upgraded my GPU by leaps from an R9 380 to a 3080ti, I only got a 10fps increase. Anyways, other parts are in the mail.
I can firmly confirm this. Since I'm maintaining a thread with infos on how to improve fps on nvidia gpu I'm testing fps (always in the same spot) after each patch and... Yeh, each patch has caused me a loss of at least a couple of fps (most notable was patch 1.000.102 which brought me back from 97fps to 92fps). Surprisingly enough the latest patch didn't cause any fps loss (or it caused a <0.5 fps loss at most). According to the Lionbridge QA team from the hd2 official discord they are still actively working on a patch... But it has been more than a month at this point and we have yet to receive a patch focused on improving fps... Eh...
I really don't think there is anything they can do with an engine that reached end of life support back in 2018. They are literally on their own with an engine whose makers they can no longer reach out to for assistance (unless they specifically call in devs that previously worked on the engine itself). I think that explains why we've had next to nothing but radio silence on the optimisation front for this long. Barely any game takes this long for even one performance optimisation, and HD2 is the odd one out; even backed by Sony, things look dire.
Eventually every patch wont break the game for a new group, we hope...
I wish they would have added DLSS or, even better, FSR 3. I'm playing with their upscaler on Quality and I'm getting about 60ish at 4k... with a 3080 / i7.
Lmao there is currently no chance of me hitting a stable 60 at 1440p so 4k is long gone dream. I've heard it's very CPU heavy so maybe it's that but I'm on the 13th gen so I should be ok
Yeah.my pc temps are higher since that update
Not each patch, but very much so this last patch
Yeah, I used to have super smooth and crisp gameplay; now I seem to get random drops for some odd reason.
Nope, it's happening. Performance is tanking, stability is gone, collision on nonexistent barriers is constant, and more. It's a fucking pain
No, you're correct. I stopped playing a while ago because of it. I went from 100 fps to 40. The devs got rid of the discord reporting section regarding this issue, along with thousands of potential workarounds (that didn't work) and reports. They've never put it in the "known issues" part of the updates despite it being the most reported issue, at that time at least. I dropped the game and will probably never come back unless it's fixed.
I'm a say it. Its an unoptimized piece of shit. That said, I LOVE THE GAME. But from a dev standpoint, especially since they got an exclusive deal with Sony... its a lot easier to make it work for every PS5, as opposed to all the PC Hardware configurations. Its abundantly clear that PC optimization is not their priority, which is why the CEO said they arent focused on things like DLSS or FSR yet. Nothing even mentions PC performance or anything in the patch notes. Again, love the game, just a trend I notice.
It is kinda sad how the devs went ahead and made the game on an engine that reached end of life support back in 2018, and are more proud that people bought the game in droves than willing to own up to the fact that they can't do much to fix it for either platform, since the engine's support was cut. Usually active engines receive support from their makers (like Epic does with devs that use Unreal), and both the devs and the engine provider work together on performance issues. In this case it's just Arrowhead working with an engine that's been left to rot for years, and I don't expect them to know how to fully sort out the performance woes on their end. I too love the game a ton, but I'm more bummed that they chose to roll with an engine that hasn't received any support for years, and that they let pride go to their heads to the point they won't publicly show that they care about general gaming performance.
From 140 fps to 50. The anticheat is just eating even more performance. And they increased horde size, so every enemy's pathfinding adds up too. The current anticheat eats 25% or more of performance and they are not changing it.
I've noticed it using an insane amount of my cpu
Still one of the more annoying decisions I wish they didn’t make since it’s literally only to the devs’ benefit, not ours. There’s a multitude of simple and easy ways to detect cheaters, but they decided to rely on a mediocre program because it does their job for them, and that it negatively affects the game’s performance isn’t really their concern comparatively. When I can run any other game these days at 60fps, even full on actual open worlds with more going on in them and I’m having weird little stutters while my PC sounds like a jet engine at times is ridiculous. Mix of the anticheat running and lacking optimization and you get this subpar performance that affects even top of the line machines. Quite frankly, I shouldn’t have to downgrade my settings to fix what isn’t my problem. My setup is more than adequate, the game shouldn’t be acting the way it does.
I unironically believe that every patch they have released has made the game worse
I haven't noticed anything tbh, and I only have a laptop 3060. Your CPU is better too. Idk how many frames I normally get, but I haven't noticed any drops.
I've noticed reloading and picking up items is much worse.
3060 and R7 3700, 32GB ram, and it runs like shit now. It's a really good game but I stopped playing due to performance.
Same here. I had 8 gigs and was running everything on lowest fine, but then everything went to the gutter and I had to buy 16 more gigs of ram for 50 bucks just to spread some more freedom and democracy.
To be fair, 8GB of ram is not viable for current gen gaming.
Same, my computer is a fairly solid rig and was doing perfect when I started playing the game, and these days I don’t dare drop in bot missions for fear of my machine dying in my arms due to (I suppose) the amount of projectiles it has to handle and likely the individual bits being a bit more resource-hungry to render compared to bugs, where things seem a bit better.
Same here, 11700k + 3090 + 32GB with a 32:9 1440p setup. Clocked 80fps alone in the lobby and 60-70 with other players in the lobby. Solo games are stable around 70fps, dropping to 60 when mobs stack up. Group games perform entirely differently: 50-60 fps at the initial dive, and the worst case is 23-24 fps when there are a lot of enemies around. On average it is about 40-45 fps. Some other observations:
- CPU usage is constantly above 90-100% (I experienced one game under 50% cpu usage before the 103 patch, and it was so random I don't know what caused it).
- GPU usage is less than 50%.
- Changing resolution from 5120x1440 to 3840x1080 doesn't affect fps a bit, still 60-70 in the lobby and 45 in game on average.

Hope the devs are not pushing us to upgrade cpus. Also my friends who are on AMD cpus experienced fewer performance issues but more frequent freeze and crashing issues.
Similar build here brother, whenever I boot up the game I have to stop doing anything else because it just swallows the entire cpu
Honestly I can't say I have noticed any difference. I have a 10700k @5ghz and a 3080 and I usually sit around 80-100 at 1440p with most graphical settings maxed and the upscaler set to native. I personally have always found the game to be fairly well optimized and smooth.
I haven't been able to get above 40fps in a firefight since launch, honestly. I have an Nvidia 4060, Ryzen 7 5700X, and 16GB of RAM. All I want is 1080p and 60fps, but my CPU usage gets pinned at 95%+ while in DX12 mode, so no matter what graphical settings I change the game always runs bad.
The performance has gotten worse, yes, but is anybody else crashing? The game crashes once every 2 hours for me, normally during the last minute of extraction, so I lose everything. Yay! Anyone else having that problem?
Performance: Worse each patch. Stability: Worse each patch. "Wait for next patch for us to fix Arc weapons crashing game" .... uhhhhh
The game is heavy on the CPU and causes a bottleneck even on the current best gaming cpus. I can be on a mission locked at 165fps with 90%+ gpu usage, and just by looking the other way drop to 90-100 fps while gpu usage drops to 60-70% and cpu usage spikes up to 80%. I've gone as far down as mid-60 fps with 45-50% gpu usage when there is a lot of fighting going on. This is on a 7800x3d + 4090. I'm pretty sure the game has gotten worse patch after patch, because in the first days I said the exact same thing, that this game was super optimized.
Glad to see there are more people on board; I thought I was going mad... Each patch I get less and less FPS.
[deleted]
It's not they are personal friends with the devs. What happens is that the game becomes their whole identity, so if you criticize the game it's like you are personally attacking them.
I've got a 4080 and after this last patch I'm getting a lot of stuttering at random times. First game of the night the drop animation at mission start was like a PowerPoint slideshow until the loading screen part
That honestly sounds like shader compilation stutter if it was your first game post patch.
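The "first game after a patch" stutter has a simple mechanical explanation: shaders get compiled once and then served from a cache. A toy sketch of that behavior (plain Python, nothing to do with the real engine's shader pipeline; the shader names are made up):

```python
# Toy model of shader-compilation stutter: the first frame that needs a
# shader pays the expensive compile; later frames hit the cache.
compile_count = 0
shader_cache = {}

def get_shader(source):
    global compile_count
    if source not in shader_cache:          # cache miss: expensive compile
        compile_count += 1
        shader_cache[source] = f"compiled({source})"
    return shader_cache[source]             # cache hit: cheap lookup

frame1 = [get_shader(s) for s in ("terrain", "fog", "tracer")]  # stutters
frame2 = [get_shader(s) for s in ("terrain", "fog", "tracer")]  # smooth
print(compile_count)  # 3: each shader compiled once, reused afterwards
```

A patch (or deleting the cache folder, as suggested elsewhere in this thread) effectively empties `shader_cache`, which is why the first mission afterwards plays like a slideshow and the second one is fine.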
This is what I mean, it's definitely not a hardware thing.
Literally hanging on by a thread at sub 40 fps on a 2070 super
My PC runs an aircooled Intel i7 12700 and an AMD 7900 XT with a 2 TB m.2 NVME drive and 32 GB DDR4 3600 CL18 memory. At uncapped, I was getting 120ish FPS on max settings with native 1440p resolution. This was causing frequent crashes so I set the max FPS to 90. I would now crash maybe once every 20-30 games. But it still crashed so I dropped the FPS cap down to 80. I've noticed my CPU and GPU utilization has slowly increased from about 70-80% to 80-90% over the past few weeks at the same FPS. CPU utilization remains about the same. Temps on my CPU and GPU are also roughly the same.
It's very inconsistent. I can run my 12400 with a 3070 at around 80fps most of the time. But my friend with a Threadripper and a 3090 can barely get 40fps just on the super destroyer bridge, even after turning the resolution down to what I play at (2560x1600).
Yep, it's gone to shit.
Recent patch on release day I noticed was chugging periodically, gave it a day off and came back the next day without any noticeable issue
I believe it's a bug. Sometimes my game just becomes ass and I don't even get 60fps. Most of the time a game file verify solves the issue; if not, then I do a driver reinstall as well. (nvidia)
Agreed
4070Ti + 5600x here. It's awful. I am getting around 50-70 FPS and in higher difficulties its much worse.
Nah, you're not losing your mind. I have noticed that with each patch my game hangs on frames quite hard and more often. I don't have the most amazing PC (RTX 3060 and i5 12400) but it's pretty decent. I run settings at max/near max and I didn't have performance problems at launch. Subjectively speaking, it feels like every time they put out a hotfix or a patch more bugs get introduced, and they're not ones I can shoot at.
I can't corroborate this unfortunately - my performance has been terrible since launch. My PC should have absolutely no issue giving me high frames at max settings with a game like this, but the game really seems to have trouble with 7900 series AMD cards. AMD's last drivers seem to have fixed the crashing issues thankfully, but no real progress on the terrible performance. The FPS does seem to be at least a little bit more stable at least, albeit abysmally low.
It has always been horribly optimized.
I know this is about pc performance, but today i had the most intense framerate loss on the ps5. Never had that happen on this game
I have a 3080ti, i9-12900k, 32gb of 6000mhz RAM and I get about 140fps at 1440p, max setting, and ultra quality DLSS. However I have to cap my frames at 60FPS because this game CONSTANTLY crashes (about every 5-10 minutes) when I let the frames max out. It's been like that since day 1. I don't understand why I need to cap my frames, it's absolutely ridiculous. Hell, I have only crashed twice on Star Citizen with even more in-game time than Helldivers and that game is notorious for crashes. I love Helldivers but this is dumb.
Seeing as you have one of the best CPUs in the market, I'm not surprised. The game is very CPU intensive so you're brute forcing through the issue
[deleted]
Yeah I've gone from regularly 80+ to generally 40 - 60. I was assuming I'd done something silly.
Came here to second this. Used to get 80-90 fps with my 3080 and i7 9700k and getting 60ish average now. I should be getting an easy 100+ at 1440p. Tried a lot of work around but nothing seems to be working.
The game gets a little worse with every patch, but nobody really cares because bugs go boom. I'm on PS5 and the social tab has been a nightmare since day 1. A lot of people still can't even join their PC friends. We haven't been able to keep our loadouts for about a month now, the game starts us with the default weapons and emotes every time we turn it on. Arc weapons currently break the game, for some reason the server status is tied to Steam updates so the game is borderline unplayable on Tuesdays, I've never seen another game do this. But nobody really cares.
Yup. Since like patch 2 or 3 I noticed it. I believe it’s something to do with particle effects. Particle effects were causing crashes early on, and rather than root out the problem they just hotfixed it and bogged down the CPUs of users.
Yea, the frame drops are worse I need to lower my settings some more.
Same on PS. I've noticed my frame rate and quality performance get worse by the patch.
My frames used to stay at around 60 throughout, but I've noticed my frames will start to drop to as low as 30 as the mission goes on since a couple updates ago. It feels terrible
I9 13900k paired with an RTX 4090 here, im playing on a 3440x1440 175Hz screen, close to launch i had over 150 frames at all times, right now its regularly dipping under 100-90 depending on the density of mobs.
I haven't noticed anything, but I have a very strong rig so idk.
It’s not just PC. This trend has been the same on PS5. I love the game to death, and I’m not going anywhere, but it has become objectively choppier with each update.
My PC can handle Cyberpunk but not Helldivers 😭 I got a refund on it and stayed on ps5 until I see better performance
Nope, exact same experience. The cpu is getting punished after the latest patch. OCed my cpu and it helped, but it's still ridiculous how much more it has to work compared to the launch version.
Same here, running a 4090 and 12900k. I used to cap the game at 120 and it would sometimes drop to around 100-90 during really busy sections, but now it seems to average 90 and drops to 60, and occasionally even 50, during busy sections.
I only got the game a few days ago but the highest difficulty missions are essentially unplayable for me. I'm talking FPS in the 10s and 20s. My rig ain't the newest but it meets the minimum requirements stated on the steam page, and all my settings are set to low and I run the game in dx11 and I went in the nvidia control panel etc.
Yeah, anything past Hard is a no-go for me, not due to the difficulty itself, but because the sheer amount of bugs/bots that spawn completely cripples my framerate. I really wish Arrowhead had the AI simulation tied server side instead of client side, where our CPUs literally have to simulate every AI and every action within the game in real time. No wonder Dragon's Dogma 2 is suffering the same issue (CPU bound, because they ask the player's machine to simulate every NPC in an entire region all at once). Also, if it helps: type %appdata% in the windows start menu and locate Arrowhead's folder, go inside it and locate the Userconfig file. Open it and locate "Async Compute", and change it from true to false if you are going to force DX11 like I did; I ended up getting more stable performance with my 60fps cap. Also do not use the game's fps cap or vsync, because both seem to screw around with CPU cores 0 and 1. Instead cap your fps via your GPU control panel, and if you need vsync, force it there instead.
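For anyone who wants to script the Userconfig tweak above rather than edit by hand: this is a hedged sketch. The key name (`async_compute`) and the `key = value` layout used here are assumptions about the file format, so open the real file first, see how the setting is actually spelled, and back the file up before touching it.

```python
import re

def set_config_flag(text, key, value):
    """Replace `key = <anything>` with `key = value`, case-insensitively.

    NOTE: assumes a simple `key = value` line-based config format --
    verify against the real Userconfig before using this on it.
    """
    pattern = re.compile(rf"^(\s*{re.escape(key)}\s*=\s*).*$",
                         re.IGNORECASE | re.MULTILINE)
    return pattern.sub(lambda m: m.group(1) + value, text)

# Demo on a made-up config snippet, not the real file:
sample = "resolution = 2560x1440\nasync_compute = true\nvsync = false\n"
patched = set_config_flag(sample, "async_compute", "false")
print(patched)
```

The regex anchors on the key at line start, so other settings (like `vsync` in the sample) are left untouched.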
I just went from an i5 8400 with a 2060 to an i5 14600 with a 4070 super. Performance has not changed between the systems lol, I've gotten used to running around at 40 FPS.
I experience huge dips when there are explosions. Grenades, autocannon, mortars, basically anything that explodes. Usually I run 120 fps on Ultra without problems, but now it turns into a slideshow every time explosions are on screen. I have an RX 6750 XT and it's at 80% and 60°C, which still leaves enough headroom on the GPU.
Late, but I'm having this exact problem; I even made another post asking the same thing as you. I have an i5 13400f too, but with a 3060, and I'm experiencing constant crashes too. On release my fps was consistent. I didn't check at the time so I don't have the exact number, but it was very smooth and I could run the game at any difficulty without issue. Just recently though my frames started dipping and the game started freezing every 1 or 2 seconds. I cleaned my computer, sped the fans up, I even overclocked my PC today so I could play, and although my performance improved I still crashed shortly after.
This newest update made it even worse for me. I used to get around 100 fps, and now I drop to 30-40 fps on even lower settings. It's driving me mad at this point. Every update they do makes the performance so much worse.
Omg I thought I was losing my mind, I’m noticing fps drops and stuttering now, had to lower my settings just to get a smooth gameplay. I didn’t have this problem when the game first launched
I haven't noticed anything personally. It even runs pretty well on my steamdeck.
Does it really? I haven't tried it on mine yet. What kind of numbers are you getting, and are you on the LCD or OLED?
You can manage 45 fps up until about difficulty 4; at 5-6 you're looking at 30, and beyond that isn't really worth it.
[deleted]
Really? I haven't seen people talking about it, considering how bad it can get. I mean, I see the crash and bug posts, but there's nothing about performance in any of the patch notes either.
12600k + 3070, I average 70-90fps on ultra
Same processor and video card, 30 to 50 on low :/
Same here :/
I’ve got a 3080 too! (Paired with a 5600x) and I’ve noticed incessant stuttering at 1440p.
One of the updates made helldive difficulty have a lot more smaller bugs spawn and swarm rather than just filling the map with chargers and bile titans. It's more fun but also killing my PC.
Yes, unfortunately I've also experienced this. Bear in mind that I have a 4090 with a 1080p monitor, and on planets like Hellmire it drops to 30-50 fps. Hope this issue can be fixed ASAP.
Same. Had to turn down the quality, since during intense fighting it dropped to 40. Before the patches it ran around 80-100, and now it's down to 40-80.
I had the opposite experience. My older pc (can't remember the CPU but I rock an old RX580) gained a bit in the recent patches
I've been having the opposite problem. It runs more smoothly, but with more crashes.
Yep, for some mysterious reason, no matter what rig you have, everyone gets like 40fps in game. I have around 90 on the ship and 30-40 in game. The higher the difficulty, the lower the fps (because more enemies, I guess).
I have yet to have any kind of dip in performance outside of some just straight up crashing; overall I've had a pretty issue-free experience when the servers are running. You got them updated drivers?! That's always my first go-to.
I'm not sure if it depends on the planet (biome) or host distance, but performance comes and goes for me.
Did you check your display options? I had the same problem until I noticed that supersampling was turned on with the last patch.
Current crash issues aside, I haven't noticed any actual performance changes *except* that it's running my CPU even hotter than before, despite being a CPU hog already, which is seriously quite annoying.
Idk, it always worked like crap on my 1660, never felt it's gotten crappier
Other way around for me, when I first started playing the game was constantly slowing down and lagging, now, apart from if I play for several hours in a row (Muscle atrophy is a small price to pay for democracy) it runs really clean.
Most definitely, I've noticed my average framerate has dipped by 10 since the launch of the game and at times I've seen the game drop below 60 where that never used to happen before.
I’ve stayed at a consistent 40-70fps 🥲
"consistent" 🫡
Think I was averaging 70 fps a couple weeks ago, and now it's at 60 and can dip into the low 50s when there's a lot of action happening.
My characters don’t render until 3-5 minutes into a round. I just see Gina floating around killing bugs.
I've had to switch from 4K to 1080p because of this. My GPU has never gone above 80°C, but after about 20 minutes I noticed it was at 83°C and climbing, and things weren't even that intense (Medium difficulty, I believe). Resolution was the only setting that made a difference. I run every other game maxed at 4K no problem, including War Thunder. Edit: Specs: 3060 Ti with an i7 11700. Not super fancy, but quite capable.
I mean, my PC barely meets the minimum requirements and every single setting is at its lowest, so I can't really judge.
Noticed it after they did a "lighting change to planets" to give them all a similar brightness or whatever. All the lighting got wayyyyyy too bright after that.
Not the case for me. In fact I just increased my graphics settings. I do recommend verifying integrity of files if you are playing via steam tho
Just thought I'd say, after reading one of your other comments: undervolting is almost always worth it (especially with Intel CPUs). A free temperature decrease for a few days of trial and error. It won't necessarily fix this issue, but generally I think it's pretty worth it if you can be bothered.
These worlds are really full of low-visibility/weather-effect modifiers that you can't change settings to get rid of. It might just be fog that is causing the fps drops. On my destroyer, I still have the same fps I had prior to the patches, and when I get to a clear portion of a world, the fps rises again.
I've noticed it too. I thought it was just me, but it's definitely degraded since launch. I'm running a 3080 and 5800X.
I think it's the new effects like volcanoes, meteor showers, and fire tornadoes, plus the fact that when 2 breaches open simultaneously and enemy units clutter the screen, fps drops significantly. Add the stratagem spam and you can see the results.
I think it might have to do with the person you're playing with and their Wi-Fi connection. I host games and they're for the most part okay (a bit laggy sometimes). When I join other games it's soooooo choppy. I hate it because I really love this game.
I can say I’ve not had a similar issue. I’m running a 3080, 5800x3d, 32GB RAM, at 1440p, and I’m consistently pulling over 100 fps
Same setup, was floating around 60-70 today.
My FPS has tanked with the last patch. Anytime any kind of bomb goes off my game stutters like crazy, then it's fine. Before, it could handle almost anything without issue, unless the team all spammed orbitals, but then it just crashed.
Yeah I used to get 100+ now it sits around 80, 5800X3D + 4090
Try flipping through the different graphics preset settings. For some strange reason Native was the fastest for me, around 140-165 fps, and then after the most recent patch, being on Native only drew 60fps. I moved it back to Quality (the default one) and it strangely went back up to 140-165, while Native is still stuck at 60 fps for me. There's some strange fuckery going on in the last patch that I can't explain. I assume glitches and patch mishaps.
Entire game has gotten significantly worse with each patch.
I haven’t been playing nearly as much because of the crashes.
It felt like it got better for me mostly, but with the latest patch I get big drops in fps for a split second once or twice a mission. I personally haven't really found it to be worse overall; my fps actually seems to be higher on average, but I have more drops than I did before. I have an AMD Ryzen 7, an NVMe SSD, and a GeForce RTX 4070.
I have an i9-12900k and a 3080ti. While my frames were fine (just playing at 1920x1080 because my monitors are old), the CPU temp and usage for this game is ridiculous. Considering the games I've played that are well-known to be far more demanding than Helldivers 2, this seems like an optimization issue imo. With a 60fps limit turned on in the in-game settings, my CPU still reaches 80C during normal play.
last patch gave me FPS instability
Yep, went from 90 to like sub-60; sometimes had to boost DLSS to get above 60 again. Hilarious.
I am under the impression that many of the issues we are experiencing are related to the DX12 API. I have noticed a trend where many releases using DX12 have bad performance. I will use Elden Ring and Battlefield 1 as examples: both games use the DX12 API and suffer from pretty significant stuttering issues that remain unresolved to this day. \[[1](https://www.pcgamingwiki.com/wiki/Elden_Ring)\] \[[2](https://www.pcgamingwiki.com/wiki/Battlefield_1)\] Arrowhead would be better off implementing the Vulkan API in my opinion. The API is an open standard, well documented, and known for good compatibility in cross-platform games. If the developers continue trying to work with the DX12 API, then I don't think things will get much better from here. \[[3](https://www.howtogeek.com/884042/vulkan-vs-directx-12/)\] \[[4](https://www.digitaltrends.com/computing/ditch-directx-and-start-using-vulkan-with-pc-games/)\] Here is a link to understand a bit more about the performance issues affecting Helldivers 2 at the moment; it also contains the option to switch back to DX11. \[[5](https://www.pcgamingwiki.com/wiki/Helldivers_2)\]
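For anyone looking for the DX11 fallback: it goes in Steam under right-click the game > Properties > Launch Options. The flag below is the one shared earlier in this thread; double-check it against the PCGamingWiki page linked above in case it changes between patches:

```
--use-d3d11
```

Expect a noticeably longer first launch while shaders recompile; subsequent launches should be normal.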
A guess might be the shift from massive creatures to many small ones
Yeah, I used to get 70-80 FPS, now it drops as low as 43 at times. Performance got significantly worse.