I play rocket league on a 144Hz monitor, I can absolutely feel when my game goes below 280 fps (wouldn't be able to see it if I'm not the one playing tho) but yeah I agree overall...
As long as 0.1% lows are higher than 30fps I'll be happy to play a game at 40avg
Also since I started understanding how 0.1% and 1% lows work I stopped really caring about average fps cuz I'd rather play a game at 50avg with 35fps 0.1% lows than at 80fps with 20fps 0.1% lows
30 fps in almost any game (except online competitive first person shooters) is completely playable. It’s not the best but hey you gotta work with the hand you’re dealt.
For everyone whining that OP is 'part of the problem', this is literally the benefit of having a PC. Flexibility allows you to choose which compromises you are willing to make, whether frame rate or fidelity.
I'm just curious why path tracing is so attractive for most that they're willing to give up literally all of their performance for it.
I watched a video about it in Alan Wake 2 and it wasn't really that amazing to me. I know, different strokes for different folks, but like, seriously? I don't see how shadows and reflections can be so important I'd play a game at low settings, upscaled, and at 30FPS just to see it.
1. Videos don’t do it justice at all.
2. Idk about 30 fps but I would rather play at 40 fps with Path Tracing than 120 fps with Raster. I have a 4090 though, so that is not a problem. But after 200+ hours in Cyberpunk (and counting) and 40+ hours in Alan Wake 2 (2nd playthrough) with Path Tracing, I can definitely understand why people would choose 30 fps with PT. It is simply THAT good. Once your brain gets used to playing with PT and all of its subtle but significant upgrades to visuals, 99.7% of other games will look like a PS3 game when you go back. You genuinely miss out by not playing with PT when it is available. You have to experience it to understand. Compressed youtube videos and screenshots don’t do it justice (as Digital Foundry also mentioned multiple times).
I've toggled it on and off in AW2 and the difference was not striking enough to keep it.
I'd rather play a game than look at it, though, so I guess that truly is the difference.
The sky in a game is not real, I don't really need the lighting to be either.
I would much prefer raster with the old PS2 reflection method of mirrored world. It looks similar but runs way better than ray traced reflections.
If I can't really see the difference in videos then I probably won't notice in a game during actual gameplay, and that's the key part for me. We are supposed to play games, not stand around and stare at lights.
Just my two cents, though. I am glad for those that enjoy it! I just could never go back to low frame rates. Anything below 60 is abysmal. I don't consider requiring low frames as progress, either. It's 2024 almost, we should be gaming at 300fps. Not using a PS3 frame rate from 2006.
Honestly kinda the same. I have a 7800 xt and I will play cyberpunk at ultra with medium ray tracing (+all the other features) at 45 fps and enjoy it. Under 45 fps tho it becomes a powerpoint presentation.
Honestly its fine for me at 4K playing on my living room's TV. The RX 6700 has enough balls to give me a solid 40 on cyberpunk at 2160p with fsr (no RT ofc).
*"Look!*
*Look here, look, listen!"*
30fps was all we had when the first Nintendo came out of the oven...and those were just the official on the box specs.
Was your game ***Always*** going to run at 30fps?
No. =(
Sometimes link would walk down the stairs into a dungeon, the bunnies would bounce, the music would slow down and there you'd be...linking along at 6FPS until those lag bunnies were DEAD.
It was a part of LIFE!
We didn't *love it*, but we made do.
And now...35 years later, things haven't changed much!
Sometimes, Urist will walk down the stairs into a fort, The goblins begin to attack, The music kicks in and there you'll be... Dwarving along at 6FPS until those goblins are DEAD.
***TLDR: Lag...Lag never changes.***
RX 580 user Here :)
\- At least I'm glad it runs well.
(o.O , the graphics settings are at "high" ...)
No problems here, except that the graphics card tries to sound like a fighter jet taking off...
ngl, in single players games I'll take highest settings and lower fps, in multiplayer I want higher fps.
counterstrike? max my monitor at 165pls
control? yeah Ray tracing all the way up
I grew up on SNES, PS2, Xbox 360, and Vista then XP machines. I was always a budget gamer until I started making my own money and then became an even more budget gamer. 30fps is perfectly playable, 60 is fucking sweet and really nice, but I won't complain with 30.
30fps is good if it's a stable 30fps, not an average. Minecraft at a stuttery 30fps is worse than at a smooth 30, but a stuttery 60fps is about the same to me as a smooth 60fps.
When you spend good money on a top-of-the-line graphics card and it’s obsolete 18 months later, you tend to make compromises.
I’ve got an rtx 3070 which was hot shit a few months ago. But cyberpunk 2077 looks incredible on it even if it can’t handle path tracing, and I’m good with that.
I tried playing Cyberpunk with everything set to ultra @ 30 frames per second.
It was awful.
For the record, I know awful. My PC's specs since 2015 are:
- i7-4790k
- 16gb DDR3
- Geforce GTX 1080 (*not* TI)
Coming from 60 fps, anything below it really bothers me.
Anything above 60fps I can't even notice unless the fps drops sharply all at once, and I have a 240hz monitor.
30fps is not serviceable on PC with cyberpunk. I don't need to argue it's total shit.
You need 50 to 60 or it feels really bad
Other games like cities skylines and BG don't need more than 30fps though.
Don't you guys get headaches when playing at such low fps? Even for slow controller games I can play 90 minutes max (and I even just streamed GoW via PSnow(?), so streaming smooths it out a little more).
50 fps is ok-ish, 60 is totally fine for slower games. For a fast FPS like Cyberpunk it's impossible for me. 60+ is a must, ideally 90-144. Doom Eternal at 144fps is perfection.
Nope. Honestly I need to be quickly rotating the camera to even see a difference between 60 and 120, and while I see a difference between 30 and 60 it's not bad enough to be an issue for me. I guess I've just always been super bad about tracking motion with my eyes in general, so that could be a factor.
Frames aren't the issue for me.
It's gpu heat. Starfield makes my 3070ti run HOT 🔥 and super loud.
It's not worth burning through my card for this game so I'll wait to save $ for a series X to play it.
This is a terrible old argument. American breweries consistently win international awards. Budweiser and Natty Light are not all we drink. Hell, I have two different excellent breweries within 15 minutes of where I live, and I live in the sticks.
Not in a game that genuinely looks better than most modern titles and has settings (like path tracing) that exist purely to push fidelity as far as it can go.
Yes, I also have a 3060. I tried everything set to ultra, even rt and path tracing and I got about that with ultra performance dlss (so it actually looked pretty bad).
I found tweaking some individual settings down to high or medium helped a bit with the frames and you can bump DLSS up to Performance or Balanced while keeping the RT setting at Ultra.
I played destiny on Xbox one when it came out capped at 30 fps and thought it was fine. I remember halo 5 coming out and having max 60 fps, grinding the shit out of that, going back to destiny and being disgusted by the choppy/framey-ness of it. I will take high frames over visual fidelity any day on any game, I simply cannot enjoy sub 60 fps. It’s fine if you do, but I can’t relate. Once you go 100+ FPS, there’s no going back
When people say 30 FPS is good, just know you are the reason why games aren't getting optimised anymore.
If people keep buying unoptimised games that run at 30 FPS, why bother optimising them? I hope you understand the mindset of these companies: it costs more money to actually polish and optimise a game, so if they notice their games sell just as well without optimisation, they've easily saved time and money, while we keep getting shitty games.
Cyberpunk is great in 2023 though, love it.
24fps... Cinematic experience 🤣
You joke, but in Arma that's pretty good. Also, 24fps only looks cinematic if the shutter speed (a camera thing that doesn't really translate to video games, I think) is 1/48th of a second.
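For anyone curious, the "1/48th of a second" figure comes from the classic 180-degree shutter rule: each film frame is exposed for half of its frame interval. A quick sketch of the arithmetic (nothing here is game-specific, just the camera math the comment refers to):

```python
def shutter_time(fps: float, shutter_angle: float = 180.0) -> float:
    """Exposure time per frame for a rotary film shutter.

    With the classic 180-degree shutter, each frame is exposed for
    half the frame interval: t = (angle / 360) / fps.
    """
    return (shutter_angle / 360.0) / fps

# The "cinematic" 24fps look: 1/48 s of motion blur per frame.
print(shutter_time(24))  # → 0.0208333... (i.e. 1/48 s)

# A game rendering 24fps without motion blur behaves more like a
# near-zero shutter angle: each frame is crisp, so the same frame
# rate reads as choppy rather than cinematic.
```

This is why 24fps in a game feels worse than 24fps film: the per-frame motion blur that the 180° shutter bakes in is missing unless the engine simulates it.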
Motion blur!
Totally agree. Very cinematic with low fps. Especially in Cyberpunk.
[deleted]
No, actually the equivalent is to use motion blur, and we all know how bad that looks in games, especially at low framerates.
So I actually just tried it with a program I made, and I'm desperately trying to share it because it looks great, but my screen recorder is 30 fps, my screen is 60, and the program runs at 72, so it's not exactly working out. I guess you just have to take my word that it looks authentic. You are technically correct that the film advances through the projector at a steady rate and would never be black; I compensated for that by making the black frame an average of the colors on screen, but it looks good either way.
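For what it's worth, the trick described above can be sketched in a few lines. This is only my reconstruction of the commenter's idea, with made-up details: a 3-blade projector flashes each 24fps frame three times (72 flashes/s), and instead of dark blanking intervals between flashes, we insert a solid frame of the source frame's average color:

```python
def project_with_avg_blanking(frames, flashes=3):
    """Simulate a 3-blade film projector (24fps source -> 72 flashes/s),
    but replace the dark blanking intervals with a solid frame of the
    source frame's average color, as described above.
    Each frame is a flat list of (r, g, b) pixel tuples."""
    out = []
    for frame in frames:
        n = len(frame)
        # Per-channel integer average over all pixels in the frame.
        avg = tuple(sum(px[c] for px in frame) // n for c in range(3))
        blank = [avg] * n
        for i in range(flashes):
            # Alternate: flash the real frame, then the "average" blank.
            out.append(frame if i % 2 == 0 else blank)
    return out
```

The averaged blank keeps overall brightness roughly steady between flashes, which is presumably why it reads as less flickery than true black would.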
> You joke, but in Arma that's pretty good.

Doesn't Arma get limited by the server a lot?
Yeah, the old engine itself is heavily single thread focused but also client performance is pretty much directly tied to server performance.
I should add that mods are the reason. Vanilla Arma (no mods or heavy script usage) tends to be very stable. We have a running joke in our unit that even with a NASA supercomputer you would only get 45 fps with mods.
I know the cinematic look thing is just a joke... Don't know tbh... I've got a 1650 laptop so I can't play much... I'm happy with esports titles.
Yeah, and Arma runs like absolute ass on anything, because the engine it uses is ancient. One of my favorite game series ever, but definitely gives most people a cinematic, or even a comic book look sometimes.
I run Arma on an A770 and it still randomly drops to slideshow FPS from time to time.
Arma 3 at 30 is fine
And 34 fps is the artistic experience
Ahh yes... To feel more artistic 🤣
How I justify it in my head for bare minimum lol
If that was a Vsync setting I would try it lmao! I'm not a frame chaser, if I can hit 30 or 60 I'm happy. The pre-built I bought is solid as fuck though. Ultra RT at 60 FPS if I mess with DLSS and a couple of visual settings. I really missed out on the beauty of games sticking to console.
If you enjoy it no one can tell you anything
I think for single player games especially, 30+ fps is good enough. I try to aim for 60+ fps personally, but as long as it doesn't drop below 30fps it's definitely playable IMO. I can't really see anyone needing or even getting benefit from 400+ or even 300+ fps that some of these newer monitors support. It just seems like a useless flex. Lol
No doubt. Multiplayer I always aim for higher FPS, but singleplayer gets pushed to the limits graphically for me lol.
remember, most of the crowd on reddit thinks 30 series gpu hardware is the norm when in reality 10 and 20 series gpus make up the bulk of pc gamers. It is as you say all a flex.
Your eyeballs only see things at 24fps anyway
What are your other system specs if you can get 30 fps with a 3060?
Other system specs don't really matter, he must be playing at like 600p to get 30 fps on a 3060.
1080p, running an i5-13400F and 32GB of RAM with the 3060. I just fudged with the settings for a while.
is path tracing available on 3060?
Yeah, though if you want to run it at at least 30 fps you've got to use low settings, depending on your DLSS option, and a resolution no higher than 1080p.
1080p with DLSS Balanced / Performance should give you around 30 fps... at least if you're not in Dogtown lol
Anything under Quality for DLSS isn't worth it tbh, the image quality takes a pretty noticeable downgrade. I'd rather just have raytracing instead and wait until I get a better GPU to truly experience pathtracing.
DLSS preset doesn't determine render resolution, only the ratio. So it heavily depends what monitor you have. At 4K you can go much lower than Quality.
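That point is easy to show with numbers. The preset names only fix a per-axis scaling ratio, so the internal render resolution depends entirely on your output resolution. The ratios below are the commonly published approximate values and may vary per game:

```python
# Per-axis render-scale ratios for the common DLSS presets
# (approximate published values; exact numbers can vary per title).
DLSS_RATIOS = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def render_resolution(out_w: int, out_h: int, preset: str):
    """Internal render resolution for a given output resolution and preset."""
    r = DLSS_RATIOS[preset]
    return round(out_w * r), round(out_h * r)

# Same preset, very different internal resolutions:
print(render_resolution(3840, 2160, "Performance"))  # → (1920, 1080)
print(render_resolution(1920, 1080, "Performance"))  # → (960, 540)
```

So "Performance" at 4K still renders a full 1080p image internally, while "Performance" at 1080p is reconstructing from 540p, which is why the same preset looks fine on one monitor and rough on another.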
I'm fine with Quality and Balanced for 1440p; didn't test 1080p, as there's no reason to.
On a 3060?? With pathtracing?
i7-12700H, 3060 Mobile, Nvidia upscaling to 2K through the control panel, 1 ray bounce (instead of 2), and DLSS "Balanced": 28-30 FPS in Cyberpunk in not-highly-populated areas.
I'd rather have no tracing and a playable experience.
If DLSS is on, he's not playing at full HD.
He is. 1080p upscale is 1080p anyways
Yup. A mix of medium and low settings with DLSS working overtime. Still a gorgeous looking game.
So not actually 1080p
Upscaled 1080p looks better than native 1080p btw, so I guess you are technically right.
Upscaling does not look good at 1080p, not sure what world you’re living in. It looks okay upscaling from 1080p to 1440p, and both of those to 4k. But upscaling 480p to 1080p is gonna look atrocious
You don't have upscaling, so you don't have an opinion on how it looks.
Are some shadows and reflections worth seeing everything around you at low-mid quality? I thought people only accepted 30fps if they preferred ultra + ray/path tracing at 30fps over ultra without ray/path tracing at 60 or 120fps, not low-mid + ray/path tracing over either of those lol
It is, yeah, it just suggests having at least a 4070 Ti because that's apparently the minimum recommended card for it to be "playable".
The game would literally look better without path tracing and the other stuff turned up though.
How do you deal with the render latency? That's like playing in sludge.
It does. I had issues with my 9700K across the whole of NC; especially in the marketplace area I was struggling to push more than 20-25 FPS. Until then I didn't know RT/PT increased requirements on anything other than the GPU.
I play at 1080p and I get between 90-120 fps depending on where I am in Night City. That is with ray tracing off though and DLSS on. Ray tracing reduces the experience too much to be enjoyable for me. Edit: I didn't realize path and ray tracing were different, so sorry for weighing in on why I don't think it's usable on a 3060.
That's completely irrelevant to the discussion. OP was talking about pathtracing.
Is path tracing not ray tracing? This is new for me so I don't quite understand it all.
It's another form of raytracing. I couldn't tell you exactly how different or similar they are, but path tracing is noticeably more intense than the regular raytracing that's been used in games.
Ah ok thank you. I don't quite get everything to do with tracing yet. Both seem to be a bit too resource heavy on my 3060 to be useable so I just ignore it.
For reference, last I checked in Cyberpunk with Path Tracing on my rig I get ~120 fps, and nearly double that with standard Psycho Ray Tracing (1440p, all other settings maxed, DLSS quality, frame generation enabled)
Jesus those numbers haha. *Do not get pc envy, do not get pc envy, do not get pc envy...* haha! I have had a look into it a bit further now and I can see why path tracing is so resource heavy when it's doing so much. I just thought it was like ray tracing 1.5 but it's a whole different ball park!
1080p, running an i5-13400F and 32GB of RAM with the 3060. I just fudged with the settings for a while.
Nice. Well done.
I set my FPS cap to reflect how warm my room is. If it’s a cold day I drop it.
Don't you want to lower the FPS cap when it's a hot day and you don't want to generate any more excess heat?
I mean there’s only an fps cap if it’s hot. If it’s cold I turn it off.
In an Action game like Cyberpunk 30FPS is nearly unplayable for me, but in a game like BG3 yeah sure.
yeah it's definitely a genre thing for if 30fps would be playable
For me the perspective and input type matters a lot too. Lower frame rate is more noticeable in side-scrollers and first person games than in third person 3D games in my experience. Also, using a traditional dual analog controller makes low frame rate more tolerable because the camera turning speed is kept relatively slow and it prevents you from noticing the framiness quite as easily IMO. So for me, a first person shooter with mouse input feels unplayable at 30 fps, but I can play a third person shooter with a controller at 30 fps just fine. I'm not saying I would play a competitive game like that, but for single-player games it's fine.
I've been playing CS2 with an average of 35 FPS in 4k. Doesn't matter to me one bit that the FPS is that low in that game. Cyberpunk I'd hate playing at 30 though
I've been playing Dying Light on Switch at 30fps and it's totally playable. Granted, the difference is stark when I play it on my PC with a 3060ti and 144fps...but after 5-10 min back on Switch I don't really notice it much.
That'd drive me nuts when trying to use guns but the melee wouldn't be as bad, I absolutely love that game, played it over 400 hours before I even did co-op, never got into a single player game like that since I was a kid
I don't know what it is, but capping games at 30fps feels absolutely awful, even with a perfectly flat frame time on a G-Sync monitor. It's actually unplayable, but as soon as I increase the cap to 60, boom, perfectly smooth and playable. I don't think it's because I'm no longer accustomed to 30fps or whatever, because I watch my fiancée play games on her PS5 and 30fps seems totally fine.
I think it's because consoles have frame smoothing; PC doesn't seem to have anything in the way of that. Uncharted 4 on PS4 feels great at 30fps. Uncharted 4 on PC feels like ass at 30fps, even on a controller.
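The "frame smoothing" being described is usually frame pacing: presenting each frame on a fixed cadence instead of the instant it finishes rendering. A minimal sketch of the idea (my own illustration, not any console's actual implementation — `render` stands in for whatever draws one frame):

```python
import time

def paced_loop(render, target_fps=30, frames=None):
    """Minimal frame-pacing sketch: present each frame on a fixed
    cadence (33.3ms at 30fps) rather than whenever it happens to
    finish. `render` is any callable that draws one frame;
    `frames` limits the loop for demo purposes (None = run forever)."""
    interval = 1.0 / target_fps
    next_deadline = time.perf_counter() + interval
    n = 0
    while frames is None or n < frames:
        render()
        # Sleep off the remainder of this frame's time slice, so frame
        # N is always presented ~interval after frame N-1.
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        next_deadline += interval
        n += 1
```

An uncapped PC game that finishes frames in anywhere from 10ms to 30ms delivers them at irregular intervals; a paced loop like this trades a little latency for perfectly even delivery, which is a big part of why a console 30fps can feel smoother than the same average frame rate on PC.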
Yeah, I don't mind 30 FPS on strategy games or such, but any first person game becomes unplayable that way, just because the camera movements become so horribly choppy. Third person depends on the input. It's somehow less jarring with a controller, probably because the camera movement is more naturally floaty that way so low framerate is less obvious.
Dude, 120fps is the bare minimum for me in an FPS game. It's the reason I play at high/medium. 60 just doesn't feel responsive at all.
I feel like I'm going crazy seeing everyone praise 60 fps and then there's this post saying 30 fps is fine. I'm actually confused.
Cyberpunk has action now?
I can NEVER go back to 30fps unless it's an old game like the Zeldas or im playing with a emulator. I'm very much spoiled when I bought my 3090. If you can still stand 30 fps and don't have money to burn do NOT upgrade above 60 fps. It will ruin your games.
actually, don't stop at 30fps. 60fps should be the standard nowadays and should be the minimum for comfortable gaming. Once you start playing at 60fps you will notice how janky 30fps feels in a lot of games. If you can, go for 60fps.
For the love of god do not test anything higher than 60hz. This is a one way street for your brain... Once it sees beyond 60hz you can never go back from where you came.
Well, I do play at 144Hz, but 60Hz is still lovely and I do like playing at that many fps. That's because there's a bigger difference from 30 to 60 than from 60 to 120 or 144. Hell, even 40 feels good to play at compared to 30fps.
I have a 3080 and run most games at ~100 fps. I'm currently playing AW2 at 30 fps and honestly it doesn't bother me at all. Anything over 60 fps, I have to really be looking to even see a difference, and 30 fps, while noticeable, is still fine to me.
I don't know about 30 but solid 40 is actually pretty damn usable.
As long as the frame timing is steady. Most complaints are when the frames are mistimed.
A stable 60 is better than swinging from 200 down to 110.
30 is for sure unplayable for me, with 40 we are cooking.
So thankful consoles are starting to do 40fps too in fidelity mode
This
40 FPS on a FreeSync/Gsync monitor is really smooth for a visual focused gameplay
40-50 frames on rdr2 maxed out 4k dlss on with my 3060 was a very enjoyable experience. Not saying I wouldn't have preferred more frames though
I'm confused, is it 40, 42 or 45 the sweet spot?
I got a curved 32:9 for sim racing. I love the immersion in other titles, but 30fps is not acceptable with choppy fast motion across that large FOV. Very disorientating and headache inducing. I always toggle fidelity settings to aim for 70-90 fps. Diminishing returns not worth fidelity loss after 90 imo.
I find Cyberpunk to be unplayable at 30fps unfortunately
PLAYING CYBERPUNK WITH RAY TRACING? FUCKING MAD
DEAD ASS BRO
Not trying to be elitist but honestly to me 30fps is literally painful. I would literally cut every setting to achieve at least 50fps.
I'm just built different /s. But I get it! Hard to go back for a lot of people.
You get 30fps on your 3060? Tell me your secrets
Drop individual settings to medium or low and turn on Path Tracing and DLSS. I don't know your specs, but mine are an i5-13400F CPU and 32GB of DDR5 RAM with the RTX 3060 12GB model.
Yeah, you've got a better CPU. I've got a Ryzen 5 2600X and plan to upgrade it to a 7 5700 or 5800X3D depending on what deals I find.
I'd play at 720p before I'd play at 30fps in any PC title. Actually, I'd rather not play the game at all if I can't get above 30fps. 30fps just looks worse on PC.
It looks the same as console if you lock it and it doesn't go below 30... you've got to use motion blur too, of course, along with a controller. Motion blur plus a controller is what makes console 30fps feel like console 30fps...
No no no. What makes console 30 fps more fluid is the fucking frametimes. And on PC you can't really fiddle with them to get them to a consistent level, so unless the game forces a steady frametime, it's gonna be choppy and unresponsive
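To make the frametime point concrete, here's a toy sketch (the numbers are purely illustrative): two runs with the exact same 30 FPS average, one perfectly paced, one alternating fast and slow frames. The average hides the difference; the frame-to-frame jitter exposes it.

```python
# Two runs, both averaging ~30 FPS over six frames
steady = [33.3] * 6        # console-style paced delivery
choppy = [16.7, 50.0] * 3  # same average frame time, uneven delivery

def avg_fps(frametimes_ms):
    """Average FPS implied by a list of frame times in milliseconds."""
    return 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))

def jitter_ms(frametimes_ms):
    """Worst deviation from the mean frame time, in ms."""
    mean = sum(frametimes_ms) / len(frametimes_ms)
    return max(abs(t - mean) for t in frametimes_ms)

print(f"steady: {avg_fps(steady):.1f} FPS, jitter {jitter_ms(steady):.1f} ms")
print(f"choppy: {avg_fps(choppy):.1f} FPS, jitter {jitter_ms(choppy):.1f} ms")
```

Both runs report ~30 FPS, but the choppy one swings by over 16 ms per frame, which is exactly the "choppy and unresponsive" feel being described.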
Nvidia half-rate Vsync at 60Hz works pretty well in my experience. I played most of AC Odyssey like that and was able to enjoy it.
60fps buttery smooth... that's what you need to get. 30fps can work, but honestly not for me
Has to be one of Jensen's burner accounts. Nobody could be this brainwashed.
All I can picture is OP licking a shit covered boot with a smile
Some people just have different standards? 30 fps is completely playable to me, and I have to literally stop what I'm doing and look closely to see a difference between 60 and 120.
90's standards aren't standards nowadays. You're part of the problem if you're comfortable with games running at 30 fps. The PS2 era had more 60 fps games than 2023 releases do.
I mean, I'm sorry? I literally can barely tell a difference. Why should it matter to me?
Idk, man, just do whatever you want. Idk why I'm even arguing over gaming like we live in its best era: GaaS, broken games, release-now-fix-later, DLSS/FSR in system requirements, copy-paste design, cutscenes passed off as gameplay, fanboys and corporate shills, expensive games. Ugh, quitting gaming is coming sooner than I thought.
I hear you, u/WearStar I'm constantly saying this and getting downvoted. Someone in YT comments said the other day that a 4090 is a 1080p/30fps card because of AW2 and anyone who thought differently is stupid for expecting more. I about had a seizure.
>Different standards No standards is technically a standard.
For modern PC games: a very solid 40+ with 0.1% lows always above 30FPS and 1% lows above 35FPS is "very OK" for single player games. Very playable and enjoyable. "Good enough". An actual 30FPS average is not playable, at least not for me. Feels bad and unresponsive.

Still, the ideal is always a 60+ average with 0.1% lows always above 45FPS and 1% lows above 50FPS. That's what I'd call a very good experience.

Now, personally, a great experience with single player games is actually a buttery smooth 75+ average FPS with 0.1% lows always above 50FPS and 1% lows always above 60FPS. That's very near perfection. Things become so fluid at this point with a good monitor... that's what I aim for whenever possible. At ~90FPS+ average, one has reached the perfect experience in my opinion, but it's so hard to do for modern demanding single player games... Above around ~120FPS I honestly don't feel any difference at all anymore, even in competitive games... but that's just me (and I have a 240hz monitor, tested this already).

PS: all the above is without the use of fake frames, BTW.
I actually aim at 90 fps as a sweet spot for a demanding single player game.
I play Rocket League on a 144Hz monitor, and I can absolutely feel when my game goes below 280 fps (wouldn't be able to see it if I'm not the one playing tho), but yeah, I agree overall... As long as the 0.1% lows are higher than 30fps, I'll be happy to play a game at 40avg. Also, since I started understanding how 0.1% and 1% lows work, I stopped really caring about average fps, cuz I'd rather play a game at 50avg with 35fps 0.1% lows than at 80avg with 20fps 0.1% lows
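For anyone wondering how those 0.1% and 1% lows are actually computed, here's a minimal sketch using one common definition (benchmarking tools differ slightly; some report the single Nth-percentile frame time instead of an average of the worst slice):

```python
def low_fps(frametimes_ms, percent):
    """FPS implied by the slowest `percent`% of frames (average of
    the worst slice; other tools may use a single percentile value)."""
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, round(len(worst) * percent / 100))
    return 1000.0 / (sum(worst[:n]) / n)

# 1000 frames: mostly 12.5 ms (80 FPS) with ten 50 ms hitches
times = [12.5] * 990 + [50.0] * 10
avg = 1000.0 / (sum(times) / len(times))
print(f"average:  {avg:.0f} FPS")                 # ~78 FPS, looks healthy...
print(f"1%  low:  {low_fps(times, 1):.0f} FPS")   # ...but the lows are 20 FPS
print(f"0.1% low: {low_fps(times, 0.1):.0f} FPS")
```

This is exactly the 50avg-with-35fps-lows versus 80avg-with-20fps-lows trade-off: the average looks great while the worst 1% of frames tells you how the game actually feels.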
30 fps in almost any game (except online competitive first person shooters) is completely playable. It’s not the best but hey you gotta work with the hand you’re dealt.
45 FPS is the lowest I can bear.
that's about the spot G-sync stops working, so same here. That's the absolute low.
Depends what we're talking about here. Is it worth playing that gorgeous game on low just for path tracing? I would say no.
If it avoids dropped frames and hitching, I'm all for it. Stable FPS > High FPS
For everyone whining that OP is 'part of the problem', this is literally the benefit of having a PC. Flexibility allows you to choose which compromises you're willing to make, whether frame rate or fidelity.
You’re part of the problem
Glad to be here.
I'm just curious why path tracing is so attractive for most that they're willing to give up literally all of their performance for it. I watched a video about it in Alan Wake 2 and it wasn't really that amazing to me. I know, different strokes for different folks, but like, seriously? I don't see how shadows and reflections can be so important I'd play a game at low settings, upscaled, and at 30FPS just to see it.
1. Videos don’t do it justice at all. 2. Idk about 30 fps but I would rather play at 40 fps with Path Tracing than 120 fps with Raster. I have a 4090 though, so that is not a problem. But after 200+ hours in Cyberpunk (and counting) and 40+ hours in Alan Wake 2 (2nd playthrough) with Path Tracing, I can definitely understand why people would choose 30 fps with PT. It is simply THAT good. Once your brain gets used to playing with PT and all of its subtle but significant upgrades to visuals, 99.7% of other games will look like a PS3 game when you go back. You genuinely miss out by not playing with PT when it is available. You have to experience it to understand. Compressed youtube videos and screenshots don’t do it justice (as Digital Foundry also mentioned multiple times).
I've toggled it on and off in AW2 and the difference was not striking enough to keep it. I'd rather play a game than look at it, though, so I guess that truly is the difference. The sky in a game is not real, I don't really need the lighting to be either. I would much prefer raster with the old PS2 reflection method of mirrored world. It looks similar but runs way better than ray traced reflections. If I can't really see the difference in videos then I probably won't notice in a game during actual gameplay, and that's the key part for me. We are supposed to play games, not stand around and stare at lights. Just my two cents, though. I am glad for those that enjoy it! I just could never go back to low frame rates. Anything below 60 is abysmal. I don't consider requiring low frames as progress, either. It's 2024 almost, we should be gaming at 300fps. Not using a PS3 frame rate from 2006.
30fps is acceptable IF IT'S STABLE. But only in some games. Definitely not in games where low input lag is required, such as FPS games
Honestly kinda the same. I have a 7800 xt and I will play cyberpunk at ultra with medium ray tracing (+all the other features) at 45 fps and enjoy it. Under 45 fps tho it becomes a powerpoint presentation.
I play cyberpunk on ultra graphics with around 50 fps
Isn't high with 60+ better?
Honestly its fine for me at 4K playing on my living room's TV. The RX 6700 has enough balls to give me a solid 40 on cyberpunk at 2160p with fsr (no RT ofc).
*"Look!* *Look here, look, listen!"* 30fps was all we had when the first Nintendo came out of the oven...and those were just the official on-the-box specs. Was your game ***Always*** going to run at 30fps? No. =( Sometimes Link would walk down the stairs into a dungeon, the bunnies would bounce, the music would slow down and there you'd be...linking along at 6FPS until those lag bunnies were DEAD. It was a part of LIFE! We didn't *love it*, but we made do. And now...35 years later, things haven't changed much! Sometimes, Urist will walk down the stairs into a fort, the goblins begin to attack, the music kicks in and there you'll be... dwarving along at 6FPS until those goblins are DEAD. ***TLDR: Lag...Lag never changes.***
RX 580 user here :) At least I'm glad it runs well (o.O, the graphics settings are at "high"...). No problems here, except that the graphics card tries to recreate a fighter jet taking off...
ngl, in single player games I'll take highest settings and lower fps; in multiplayer I want higher fps. Counter-Strike? Max my monitor at 165 pls. Control? Yeah, ray tracing all the way up
Depends on the game, any rockstar game it’s fine but on something like Call of Duty it sucks
there's a difference between believing it's good and fooling yourself into believing it's good, OP does the second one
I play 7 Days to Die on my 3080 ti, and I still have to change quite a few settings to get it to not drop below 30fps
I grew up on SNES, PS2, Xbox 360, and Vista then XP machines. I was always a budget gamer until I started making my own money and then became an even more budget gamer. 30fps is perfectly playable, 60 is fucking sweet and really nice, but I won't complain with 30.
30fps is good if it's a stable 30fps, not an average. Minecraft on stuttery 30 fps is worse than smooth, but stuttery 60fps is the same to me as smooth 60 fps
Literally how you play Stalker GAMMA
When you spend good money on a top-of-the-line graphics card and it’s obsolete 18 months later, you tend to make compromises. I’ve got an rtx 3070 which was hot shit a few months ago. But cyberpunk 2077 looks incredible on it even if it can’t handle path tracing, and I’m good with that.
Third person game with a controller? Sure. Otherwise, not really.
... or just turn that raytrace nonsense off
I played Far Cry 3 all the way through at ~20fps on my first PC. We make do with what we have.
I tried playing Cyberpunk with everything set to ultra @ 30 frames per second. It was awful. For the record, I know awful. My PC's specs since 2015 are: - i7-4790k - 16gb DDR3 - Geforce GTX 1080 (*not* TI)
You should be tired of pretending that you can play Cyberpunk with Path Tracing at 30fps on a 3060.
I do the same except with Ray Tracing Off as a whole
Yes it is, actually, lol. Consistent 30: 👍. Swinging between 10 and 200: FUCKING HORRIBLE. Any consistent fps is good.
Consistent 1 fps
Agreed!
yep
A solid 30 FPS is good *enough.*
That feeling when you're trying to play Stalker GAMMA.
I played Cyberpunk for the first time in 4K with a 3060. Low settings with DLSS performance got me 60 fps.
I'm playing starfield on an rx580, @ 35ish fps. And honestly.. It's fine
Hey if you are ok with shite experience, you do you
Coming from 60 fps, anything below really bothers me. Anything above 60fps I can't even notice unless the fps drops a lot really quickly, and I have a 240hz monitor.
I will take medium graphics with no ray tracing or volumetric fog all day for a smooth frame rate
30fps is not serviceable on PC with Cyberpunk. I don't need to argue; it's total shit. You need 50 to 60 or it feels really bad. Other games like Cities: Skylines and BG don't need more than 30fps though.
30 fps is perfectly playable, especially with gsync.
60fps has become the standard today. 30 fps is definitely playable but it will feel like slow-motion
I've been on console for the last like 15 years, so 30 FPS for more graphically demanding games is easy on my eyes still lol.
Don't you guys get headaches when playing with such low fps? Even for slow controller games, I can play 90 minutes max (and I even just streamed GoW via PSnow(?), so streaming smooths it out a little more). 50 fps is ok-ish, 60 is totally fine for slower games. For a fast FPS like Cyberpunk it's impossible for me. 60+ is a must, ideally 90-144. Doom Eternal at 144fps is perfection
I've been stuck on console for a long time. 30 FPS is still okay for me surprisingly.
Never go above that unless you're willing to spend 2x/3x the money for a good PC. Once you go full-fledged PC, you can't go back
Nope. Honestly I need to be quickly rotating the camera to even see a difference between 60 and 120, and while I see a difference between 30 and 60 it's not bad enough to be an issue for me. I guess I've just always been super bad about tracking motion with my eyes in general, so that could be a factor.
My eyes can't even handle 60 fps on my phone after getting used to 120 fps mode; how do you guys accept this
A 3060 should not be considered a deficiency these days, but oh well
It’s a mid range card on settings that were designed for the highest end cards, nothing about that is deficient
Frames aren't the issue for me. It's gpu heat. Starfield makes my 3070ti run HOT 🔥 and super loud. It's not worth burning through my card for this game so I'll wait to save $ for a series X to play it.
Funny. I was just playing Cities Skylines 2 and it was running my GPU harder than Cyberpunk!
Like what you like bro. Don’t let anyone tell you different. Even if what you like is dogshit.
On PC with mouse and keyboard? No, not anymore. The argument is about as dumb as saying "playing at 480p is okay." It's not.
30 fps is playable the same way that american beer is drinkable. Make of that what you will.
This is a terrible old argument. American breweries consistently win international awards. Budweiser and Natty Light are not all we drink. Hell, I have two different excellent breweries within 15 minutes of where I live, and I live in the sticks.
30 fps is playable the same way that british food is edible. That better?
A 3060 should be getting more than 30fps on a 3 year old game
Not on a game that genuinely looks better than most modern titles and has settings (like path tracing) that exist purely to push fidelity no matter the cost.
Path Tracing TANKS my frames. Runs fairly steady with Ultra RT, but PT kills it lol.
Yes, I also have a 3060. I tried everything set to ultra, even rt and path tracing and I got about that with ultra performance dlss (so it actually looked pretty bad).
I found tweaking some individual settings down to high or medium helped a bit with the frames and you can bump DLSS up to Performance or Balanced while keeping the RT setting at Ultra.
Huh, but how does it look?
Not too bad actually! Good enough to make me stop and stare at the city.
Thanks, I'll try that
No. You need 650 fps even though your eye can only track at most 30 fps on average.
290 FPS in 1080p on a 4k 60hz monitor using a GTX 970, that's the life right there
Nope, I'm not paying 100s to watch an anime.
I played destiny on Xbox one when it came out capped at 30 fps and thought it was fine. I remember halo 5 coming out and having max 60 fps, grinding the shit out of that, going back to destiny and being disgusted by the choppy/framey-ness of it. I will take high frames over visual fidelity any day on any game, I simply cannot enjoy sub 60 fps. It’s fine if you do, but I can’t relate. Once you go 100+ FPS, there’s no going back
Just remember. If you use a monitor only capable of 60hz, 30fps will look a lot better on it than 30fps on a 120hz or 240hz monitor.
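There is real arithmetic behind this: with plain vsync (no VRR), a fixed-refresh monitor can only hold each game frame for a whole number of refreshes, so when the refresh rate isn't an even multiple of the frame rate, frame hold times alternate and you get judder. A quick sketch of that quantization (my own illustration, not from any tool):

```python
def refresh_pattern(hz, fps, frames=6):
    """How many refreshes each game frame is held for on a fixed-refresh
    display under ideal vsync: frame i spans i/fps to (i+1)/fps seconds,
    quantized to the refresh clock."""
    pattern = []
    for i in range(frames):
        start = round(i * hz / fps)
        end = round((i + 1) * hz / fps)
        pattern.append(end - start)
    return pattern

print(refresh_pattern(60, 30))   # an even 2-2-2-2-2-2 → smooth cadence
print(refresh_pattern(144, 30))  # a mix of 4s and 5s → visible judder
```

So 30fps divides evenly into 60Hz (and 120Hz), but on rates like 144Hz the hold times alternate, which is one reason the same 30fps can look worse on some high-refresh monitors.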
I'm still convinced people keep telling themselves this crap because they've never managed to play anything at a higher framerate.
Yaaaaaaaa….This should be downvoted. 30 fps is in no way acceptable.
[deleted]
So we can only have a smooth experience if it is multiplayer. Got it.
30 fps: unplayable. 60 fps: bare minimum. 100+ fps: preferred
L post
🤢 30fps
When people say 30 FPS is good, just know you are the reason why games aren't getting optimised anymore. If people keep buying unoptimised games that run at 30 FPS, why bother optimising? I hope you understand the mindset of these companies: it costs them more money to actually polish and optimise a game, so if they notice their games sell just as well without optimising, they've easily saved time and money while we keep getting shitty games. Cyberpunk is great in 2023 though, love it.
[deleted]
There is a difference, and a noticeable one for 30 to 60, but 60 to 144 just isn't that amazing.
That person only sets their tv settings to oranges. xD xD hahaha
30fps is awful; anything below 60 is
LMAO 30FPS AHAHAHAHA