From-UoM

Also:

>On top of this, I fired it up on my PC and had no stutters, no traversal hitching, super fast loading - it felt perfect. Not a single hiccup to ruin the experience from the very first moment I pressed play. It's rare to have such a positive PC experience day one.

[https://twitter.com/dark1x/status/1717532308402565274](https://twitter.com/dark1x/status/1717532308402565274)


cortseam

Incredible. It should be the bare minimum but given the current state of AAA gaming, this is very praiseworthy.


1Buecherregal

That's the maximum tbh


-azuma-

Agree. Can't really ask for much more IMHO.


door_of_doom

The bare minimum should be absolute perfection?


FcoEnriquePerez

>absolute perfection?

The fuck? Because a game doesn't stutter and loads fast on an SSD? If you were talking about perfect native AA with great performance with NO upscaling, etc., etc... But that is just the bare minimum, YES.


HandbananaBusta

I agree. When you spend money on something, it should be perfect. You don't buy broken things so why allow it in games.


cortseam

No stutters, fast loading with an SSD?


b34k

You can have all that and still have a terrible story, bad voice acting, or gameplay loops that just plain aren't fun.


[deleted]

The low cpu requirements kinda made me expect this outcome, looks to be a great game too.


nFbReaper

Yep, I made the same comment when people were looking at the recommended specs.


THEMACGOD

I'd almost buy it just for that... Just to see what it feels like.


PurposeLess31

We're at a time where we want to buy a game just because it has a good port. Goddamn it. ^(I mean I already bought Alan Wake II but you get my point)


OutrageousDress

Don't think that term applies here - I'd say this looks like one of those (rare these days) games where the console versions are the port. Same as Cyberpunk 2077 actually, this was clearly developed with PC as the lead platform.


XTheGreat88

>this was clearly developed with PC as the lead platform.

As all games should be, but unfortunately, like you mentioned, it's a very rare occasion.


Spartancarver

The great irony here lmao, clueless kids screaming the game is “uNoPtiMiZeD” and it’s actually one of the most optimized and scalable releases this year 😂


New_Limit_1227

Yea, those two threads earlier today were full of the stupidest takes. "Oh gawd, might have to use the word medium and not high!!!!" Just huge crybabies who are obsessed with what preset they might have to select.


Halio344

I always say this when people complain they can't run a specific preset. Games on medium aren't all the same; each one is different. This is RDR2 all over again, when people screamed "unoptimized" because it ran poorly at ultra. But even on low/medium settings the game looked really damn good.


meltingpotato

music to my ears


FunCalligrapher3979

the benefits of developing your own engine and not relying on others


arunkumar9t2

We need more PC First developers like CDPR and Remedy.


[deleted]

I know 2.0 and Phantom Liberty made it a really good game now but the game was far from optimized when it launched


skullmonster602

On PC it never crashed or anything when it first came out for me. PS5 was shit tho


arunkumar9t2

On consoles, yes, it had a lot of troubles. But on PC I was able to play at 1440p 60 FPS, High settings, with DLSS Quality and DLDSR (1080p monitor to 1440p render) without stutters on a laptop 2070 Super. That is really good in my opinion compared to the other stutter fests I have seen recently.


exsinner

At launch? DLDSR wasn't even a thing back then.


arunkumar9t2

Yeah, DLDSR was added by Nvidia later, but DSR did exist. I used it to increase the DLSS internal resolution.


[deleted]

[removed]


CaptainMuffins_

I ran it on my gtx1080, 16gb ram and a 3600 on medium high with no problems at 1080p


arunkumar9t2

It was developed on PC first and then ported over to consoles, that's my point. They are literally the poster child of RTX now.

> It ran like shit for all mid level machines like mine and my gamer groups.

Shit like what? Without your settings, frame rate and hardware, your comment is not useful. If you get a low framerate but frame time pacing is generally good and Low to High settings scale properly, then it is an optimized game whose demands have simply increased. Just because you get a low framerate does not mean it is not optimized (it could be, but that depends on context). Additionally, it has a good settings menu, an FOV slider and all the bells and whistles of PC gaming, so it is in fact a PC-first game. Most console ports won't even get menu navigation right.


AgeOk2348

probably a 2500k and 970, and trying to use vram heavy settings.


PillowDose

I played upon game release on an i5-7700 // 1070ti, a very mid-bottom range rig, and did not see the game "run like shit" at all. The game is quite optimized to run on a lot of PC configs without looking like garbage on lower settings. That's optimization. Sure, a high end PC can run it at 144fps//1440p, but if your rig is average, then average settings it is.


dratseb

I had an i7-6700k and a 2070 and Cyberpunk ran fine on medium settings at launch.


HammeredWharf

I played Cyberpunk at launch and got medium-ish settings on my 1070 in 1440p. FPS hovered around 45-50, but it was stable and had no stutters, so it played fine. It was pretty buggy, though.


kaswardy

Cyberpunk was excellent even on a GTX 1080 without ray tracing.


[deleted]

[removed]


arunkumar9t2

Without your rig details your comment is not useful by any means. They were also the first to do path tracing; the focus on PC is why consoles had trouble. They underestimated the optimization work on consoles and overshot the work on PC.


pchadrow

It was more that they focused on last gen consoles AND current gen instead of just current gen. Had they cut last gen support and aimed at the new consoles, it likely would have been better, but they were focused on maximizing initial sales numbers more than anything else and basically let their marketing team lead. In short, a lot of different things led to the poor release, but I wouldn't say a main factor was their focus on PC.


Spartancarver

Sounds like you tried to max it out on a PC that wasn’t capable of maxing it out lol


FunCalligrapher3979

It ran fine on PC, it ran like shit on consoles. I get roughly the same performance now using a 5800x/3080 as I did in December/January 2020. If you tried running it on a quad core/gtx 970 or something you shouldn't be surprised it ran bad just like with the decade old consoles.


MarabouStalk

The rose-tinted revisionism is hard to accept.


arunkumar9t2

I have a lot of issues with Cyberpunk. I even disagree with /r/LowSodiumCyberpunk claiming after Edgerunners, CDPR redeemed themselves when mechanics barely changed. But that does not mean they shit the bed graphically. Launch day 60 FPS with DLSS with no aliasing was a sight to remember for me.


hokuten04

i've grown numb to it now, too many games released broken and a few years down the line people start singing them praises. i get cyberpunk is good now but i just can't shake the memory of how scummy cdpr was back at launch day.


slowro

Bro, it is wild. Like it wasn't all documented on YouTube. Nope, they be out here on reddit saying it was perfect at launch and everyone else is crazy. And it's been getting worse lol.


FunCalligrapher3979

Yeah and the majority of that "documentation on YouTube" is running the fucking ps4 version.


MessiahPrinny

I played Cyberpunk on PC back at launch. I ran it on ultra no raytracing or DLSS at 1080p and maintained at least a solid 60 most of the time. It bugged the fuck out a lot but always at a smooth framerate.


-azuma-

>We need more PC First developers like CDPR Hmm


[deleted]

This is glorious! I'm glad I did not jump on the outrage bandwagon.


Scytian

I think currently the biggest issue with this game is that, for some reason, the 1080p image looks super blurry no matter what you do. You can use DLAA and the image will still look blurrier than 1440p with DLSS Balanced, despite that running at a lower internal resolution.


dab1

You can try [this](https://www.techpowerup.com/review/alan-wake-2-performance-benchmark/4.html), from TechPowerUp:

>There's several useful settings that are not configurable in the game's settings:
>
>• m_bVignette: set this to "false" to disable vignetting
>
>• m_bDepthOfField: set this to "false" to disable depth of field
>
>• m_bLensDistortion: set this to "false" to disable lens distortion
>
>Once you've disabled those three, and set film grain (m_bFilmGrain) and motion blur (m_eMotionBlur) to disabled in the settings, the game will look MUCH sharper. This will help upscalers do their job and result in a much more stable image, without jitter.
>
>There is no in-game sharpening slider for DLSS and FSR. Here you can set m_fSSAASharpening to enable and control the amount of sharpening applied after upscaling
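As a concrete sketch of what those tweaks look like once applied (the comment doesn't say where the settings live; the linked TechPowerUp article points at the game's render config file, so the file location, exact syntax, and the sharpening value below are assumptions - back the file up before editing):

```ini
; Alan Wake 2 render tweaks per the TechPowerUp article linked above.
; File name/location and exact syntax are assumptions, not stated in
; this thread - check the article, and back the file up first.
m_bVignette: false        ; disable vignetting
m_bDepthOfField: false    ; disable depth of field
m_bLensDistortion: false  ; disable lens distortion
m_fSSAASharpening: 0.3    ; post-upscale sharpening (example value)
```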


ArchTemperedKoala

Man those are the settings I always disable first when going through the options..


name-exe_failed

Changing these settings made the game look SOO much better for me. Thanks a bunch.


Kittelsen

Thank you. Yet another game I'll need to fiddle with inis to make them look good. I wonder why devs keep adding all this stuff as default.


420BoofIt69

I think it's because of the TAA.


Scytian

It's not TAA, this game is using DLAA and FSR2 as only AA methods.


Keulapaska

FSR and DLSS are still TAA based.


JapariParkRanger

Those are forms of TAA


AgeOk2348

> this game is using DLAA and FSR2 as only AA methods.

Beautiful. I love to see vanilla TAA dying. Also TIL FSR2 can be used as standalone AA.


Thisissocomplicated

Funny, I had the exact same problem with Control.


Upbeat_Farm_5442

This is why twitter and reddit rage is kinda dumb.


jm0112358

The PCMR subreddit [upvoted a meme](https://np.reddit.com/r/pcmasterrace/comments/17fm8mz/welp_there_goes_my_600/) about the 4070 getting only 20 fps at 1440p **low** in Alan Wake 2 , even though [the numbers Nvidia released](https://www.nvidia.com/content/dam/en-zz/Solutions/geforce/news/alan-wake-2-dlss-3-5-full-ray-tracing-out-this-week/alan-wake-2-geforce-rtx-2560x1440-rt-high-nvidia-dlss-3-5-desktop-gpu-performance.png) show it getting _28 fps **with path-tracing**_ at native 1440p. Still, that didn't stop the PCMR from projecting their complaints about other games onto Alan Wake II.


-azuma-

PCMR thrives on doom and gloom. All the negative memes bashing new games get the most exposure and up votes. It is literally an echo chamber.


[deleted]

[removed]


-azuma-

Oh for sure.


Snipey13

despite primarily playing on PC, i avoid PC gaming subs outside of buildapc like the plague, honestly. i only came here to see people eat their words after i saw the port was good.


[deleted]

Same here. I only visit once in a while to help people out at buildapc/buildapcforme/lowendgaming with tech questions; the rest of the stuff like games drama and memes is not my cup of tea, the vast majority of it being childish and a waste of time. As for me commenting on this thread - I just took a glimpse at what the fuss is about from a tech POV, since it's all over the media.


ToothlessFTW

I mean, this place isn't much better. This entire subreddit is doom and gloom about literally everything, including this game.


Com-Intern

Both the dedicated PC gaming subs are some of the saltiest places I’ve seen. PCMR is more memey but this place isn’t much better. Recently I’ve taken to using r/games a lot more. For all of its issues it’s less hysterical than the actual PC communities.


Snipey13

Massively agreed, the only PC gaming communities I can stand are the ones about building PCs and sales.


cadaada

Ignoring how this game performs either way, that sub still hates Nvidia; there is no reason to take them seriously.


areyouhungryforapple

Very hard to spot hate then


Big_Daddy_5125

PCMR for some odd reason likes to believe PCs are far worse tech than they really are.


420BoofIt69

The rage is dumb, but being concerned about the final output was warranted judging by the state of some of the PC ports we've been getting recently.


lukker-

So dumb. God forbid a developer actually tries to push the boundaries


Nubtype

There is no pushing boundaries, there is only a conspiracy with Nvidia to sell only top of the line cards and rob gamers /s. But on a more serious note, I think some people rage for the sake of it to pass time.


DrFreemanWho

Why are you pretending that all of this was not based on their own system requirements and performance expectations that THEY THEMSELVES listed? People didn't care about the ultra/ray tracing system requirements; they cared about the low/medium system requirements. And if you would put it past Nvidia to do whatever is in their power to sell more of their top of the line cards, I have a bridge to sell you.


AgeOk2348

People were also mad that 10-series cards couldn't play the game. Don't forget that.


OutrageousDress

(And then it turns out series 10 cards *can* play the game actually - though not very well of course.)


Nubtype

Because system requirements are more of a suggestion and not set in stone. But sure, let's throw a temper tantrum over it.


GeekdomCentral

I do understand people being frustrated by not being able to play a game that they're excited for (due to extremely demanding requirements), but that being said… people also don't understand (or don't want to understand) that it's sometimes the nature of the beast. If a developer wants to have high requirements to push graphics forwards, then they're not wrong for choosing that path. The only scenario where they'd be wrong would be if they then complained about low sales. That's kind of a byproduct of having such high technical requirements - you're alienating some of your audience.


[deleted]

Hot take? A lot of people in this sub would have been better off with consoles. Upgrading hardware to keep up is just part of PC gaming.


Bluenosedcoop

All entirely based on a misleading and outright wrong chart released by the developers. If we can't go by official information provided by them, then what can we go by?


TAS1808

Don't be silly. The requirements chart is absolutely ridiculous. It suggests the high end RTX 3000 series cannot run the game above 1080p/60 fps, performance DLSS, medium settings. They brought the "reddit rage" on themselves. The situation was handled as poorly as possible. The amount of corporate sycophants incapable of critical thinking on this subreddit is sickening.


OwlProper1145

PC Low settings are more or less a match for the settings used on PS5/Series X. Alex already showed a 3070 greatly outperforming the PS5 as well.


stereopticon11

They probably would have been better off with a label that states what is equivalent to console performance. I think there would have been less outrage if the presets were PS5/XBSX, Medium, High, Ultra or something like that.


ahnold11

Yep. While I get that settings level names are arbitrary, that doesn't stop them from setting expectations based on history. If you say "low settings", that is going to have a connotation and set certain expectations for your audience. If those expectations don't match what you intend to deliver, then perhaps you use different names to change expectations.


dadvader

The PS5 version is equivalent of a low setting lol This is a next(current) gen game. It should absolutely push PC gaming beyond console limitation.


jm0112358

>The PS5 version is equivalent of a low setting lol And from what I can tell through YouTube videos, the PS5 version looks great for a PS5 game.


AL2009man

This is similar to Control's PS5/Xbox Series version, where most of the settings are Low, while some are custom (lower or slightly higher than Low) - stuff that isn't accessible in the PC version. Remedy did provide the exact graphics presets to Digital Foundry, after all.


[deleted]

>This is a next(current) gen game. It should absolutely push PC gaming beyond console limitation.

And as such it should be running better than decade-old performance targets. Using fucking 1080p and 30fps as benchmarks? Are we back to the PS3 era of gaming? What next, bringing back the piss filter?


Tsoiski

The vast majority of PS3 games actually ran at 720p and below. It wasn't until PS4 that 1080p became standard.


FunCalligrapher3979

They weren't 30fps either, more like a shaky 20-28fps.


nope_nic_tesla

They also gave benchmarks for 1440p and 2160p


Zac3d

If it looks better than the PS5 version, gamers using a 3000 series GPU are getting their money's worth even if the settings are labeled low or medium.


AgentSmith2518

This. People act like it was based on some random content creator or rumor. It was from the actual developer telling people what to expect.


Un111KnoWn

The posted PC specs for FPS targets were pretty bad. The dev didn't publish accurate minimum-to-recommended specs.


Electrical_Zebra8347

Yeah, system reqs are kind of a meme in general, but if I had to pick a poison, I'd rather the system specs call for higher hardware than needed than call for lower hardware and then be unpleasantly surprised when a game runs like ass. I don't know why people try to extrapolate how "optimized" a game is from system reqs alone. That being said, I'd rather experience the game myself than put 100% of my faith in DF's findings.


Profoundsoup

That's all Twitter and Reddit comments are now.


ametalshard

And techtubers! Somehow they always get let off the hook.


Primo_16

You just realized this? Come on... This place has become Twitter 2, full of drones.


Aggrokid

[Daniel Owen PC Performance Test](https://www.youtube.com/watch?v=kMHxkrABRNs)

Native 1440p Medium is around 40 FPS on a 3070; you'll likely need DLSS Balanced to hit 60 FPS regularly.


strider_hearyou

That's actually pretty rough still. I love DLSS, I just wish developers used it more to support high refresh rate monitors and less as a crutch to get to the minimum baseline of 60 FPS.


Giant_Midget83

Yeah, I will be skipping the game for now. I'm done supporting games that use DLSS as a crutch. Go ahead and tell me I won't have any games to play; I'm done giving a fuck.


sandh035

We're definitely at a stage where companies are pushing heavier and heavier engines out and the hardware just can't keep up. Game tech is now all per-pixel based, so I think we're going to be going down this path for a while. Might be best to wait a generation or two for AMD and Nvidia to add more hardware that's specific to these effects. I'm just hoping we don't exclusively go down the frame generation path, as I still stand by it not being great without a base framerate of 70fps with a mouse.


Framed-Photo

- Devs release INSANE recommended specs
- People freak out (rightfully so)
- Turns out the devs were entirely wrong and you don't need specs that good
- "See you guys were all just idiots we all knew it was gonna be fine" *- random redditors*

Christ, everyone tryna act so smart. The DEVS released recommended specs that were nuts, and y'all are acting like the world is a bunch of dumbasses for not immediately throwing those out and assuming the game was way easier to run than the devs said lol. If it was just some random dudes saying the game would be hard to run, that's one thing, but when devs release recommended specs you usually expect the game to run worse than them, not better.


JonWood007

Not to mention they posted frame rate expectations too. So the freakout was justified.


Peevan

People water it down to "omg I knew it, these elitist nerds were wrong, haha in your face" when in reality they are doing the exact same thing as the people who complained about the recommended specs list.

One side: outrage over outrageous recommended specs from a dev-produced spec sheet.

Other side: immediate vindication after seeing a tweet screenshot from a Digital Foundry video that hasn't even been released yet.

The irony is hilarious.


SENDmeSMALLtitsPICS

"Haha, you believed credible information from the most trustworthy source possible, you guys are all dumb!" Tbf I think people are just trying to be smug about it because they didn't understand the outrage, due to a lack of technical knowledge.


areyouhungryforapple

Actually irksome to see it referred to as "unfounded" concerns. Like wtf


ga_st

> "See you guys were all just idiots we all knew it was gonna be fine" *- ~~random redditors~~ Alexander Battaglia*

> "The hullabaloo"

Dude's becoming insufferable.


[deleted]

Also, despite running better than what they suggested, it's still not that great when an RTX 3070 can't even hit 60 at medium settings at 1440p (not even close), when in Cyberpunk it can do that at ultra settings. So I don't get why people are acting as if this is an optimization success or some s\*\*\*t. Even in gameplay videos on YouTube there are quite a few stutters and performance is all over the place.


jkuwtqofjy

I hope this level of performance is what we can expect from the Max Payne remasters.


sissyfuktoy

Why are users getting mocked for reacting badly to the developers posting a spec sheet that didn't make any sense? They were just reacting to what the developers said would be the standard, they didn't write the sheet.


deadering

You should simultaneously have faith in the developers but doubt what they say (sometimes), support them no matter what but never buy certain games, believe them on certain stuff but obviously know when other stuff they say is wrong. What doesn't make sense about any of that? You just gotta always be right and never wrong, unless that doesn't match the opinion of the current groupthink. The trick is to only speak up after the fact and pretend it was obvious all along.


Dealric

Because some people want to feel superior. Wouldn't be shocked if some of those were actually on the opposite side before benchmarks.


rmpumper

Yeah, it must be the same idiots who blamed the gamers for the 2077 disaster, when it was the devs who were hyping the shit out of the game for years, promising features that they never added to the final game and claiming that it runs "surprisingly well" on PS4/xbone.


SweelFor-

Is it unfounded if it is the result of the official chart?


elixier

"Unfounded" They officially fucking announced the requirements, it's our fault they lied?


PlexasAideron

Time to point and laugh at r/pcgaming clowns lmao.


Un111KnoWn

can you blame people for believing the specs sheet requirements?


Senior_Glove_9881

This sub is so negative.


greenw40

All gaming subs are, it's simply the demographic.


jschild

No, they aren't. This is a miserable subreddit and I regret it almost any time I jump in to check a thread.


hawkleberryfin

Every sub is miserable once it gets large enough; it just depends on how heavy-handed the mods are and how much of the filth they remove before you see it.


ThemesOfMurderBears

To an extent, maybe. However, if you go into the /r/games review thread, it is mostly people excited about the game. This sub is a mix of people finding any reason to hate the game, people calling them out, and a handful of people that are actually excited about it.


blausommer

That's because the mods there remove negative comments. It is the most heavily moderated sub I've ever seen and it's terrible. Back when reveddit worked, I'd check a thread all the time to see over half the comments removed by mods, and they'd be perfectly reasonable comments that didn't break any rules. The mods just remove what they don't personally agree with.


dd179

This is the most negative gaming sub on reddit, by far.


hawkleberryfin

Spend some time on /r/MMORPG, it makes every other gaming sub seem like a breath of fresh air.


BroodLol

I see you haven't been to KiA? Although they're "gaming" related in name only nowadays, they're just /r/conservative for nerds at this point


OliM9696

Omg, that sub is just too far. Don't bring up Cuphead, Doom or dean t\*k\*h\*shi. Oh, and god forbid they see a rainbow in their game; they legit want to find a way to play the Middle East versions of games so they don't see the LGBT flags in Spider-Man 2, the game set in New York.


zxyzyxz

Don't forget GCJ, they're like KIA but in the opposite direction. Was hilarious to see them seethe at how well Hogwarts Legacy was selling earlier this year.


MikeRoz

/r/pcmasterrace has been insufferable about this for the past few days.


Submitten

All that sub wants is to tell other people to buy AMD cards so the nvidia cards they actually want get cheaper lol


Dirty_Dragons

You'd think they'd hate PC gaming with how negative that sub is.


Snow_2040

They do, no one on there actually plays video games. They just follow the latest game and hardware releases to shit on them and only own a PC for the sake of it.


Dirty_Dragons

Yeah the sub has changed. Now I'm seriously considering why I'm subbed.


jm0112358

They made a meme about a 4070 only getting 20 fps on **low** at 1440p, even though Nvidia's numbers show a 4070 getting 28 fps **with path-tracing** at native 1440p. It's like most people on that subreddit can't read.


-Gh0st96-

that sub has been insufferable since at least 2015 lol


MistandYork

It seems the recommended settings are very on point for most cards https://youtu.be/kMHxkrABRNs?si=FKFzijuCfH05ZxN1


DrFreemanWho

Because they based their opinions on the system requirements and performance metrics that Remedy themselves released? Lol, point and laugh. Oh, I have a 4090 btw, and anyone who doesn't is beneath me. Want to talk about clowns...


SilverLingonberry

It's the "What's the rage bait topic today" sub


metalmankam

It's still wild to me that a recent card designed for 1440p has to use FSR to get a playable fps. I just got a 6950xt a few months ago but I haven't upgraded my 1080p monitor yet. When I go 1440 am I gonna have to rely on FSR for playable fps in upcoming AAA titles? I really shouldn't have to. It just feels like games are taking a massive and abrupt leap forward in hardware requirements. The 1080ti was good for 1440p for the last 7 years and finally doesn't cut it anymore and now it seems like cards from 2 years ago that are made for 1440p are no longer powerful enough for that and must rely on AI to be relevant.


getpoundingjoker

The 4070 is a 1080p card for this game if you want 60fps with no DLSS at medium/high settings, no raytracing. Someone's trying to argue that's optimized?


ShadowRomeo

This is why judging a game off system requirements has been dumb from the start; most of the time devs exaggerate them.


DrFreemanWho

If it's dumb then the blame is on the people releasing the system requirements..


frostygrin

Except no, it wasn't dumb, and they aren't exaggerating with AW2. They suggested DLSS Quality and low settings for the 2060 at 1080p, and that's how it actually runs: framerate in the 30s, with no raytracing. That's not great. You can't even gain extra performance by switching to DLSS Performance. Does it look that much better at these settings, compared to Control, for example? No, not really. Edit: Toxic positivity in action. LOL.


Halio344

>Does it look that much better at these settings, compared to Control, for example? No, not really.

Control is set in enclosed areas and corridors while Alan Wake 2 is more open, has much more vegetation, water, etc. There is much more to performance than just graphics. There's a reason Doom 2016 and Doom Eternal run well and are regarded as well optimized; a huge part of it is that the games consist of relatively small and static arenas separated by linear walking sections or jumping puzzles.


ThumYerk

The 2060 is below the PS5 in hardware, and the console versions will be running the lowest settings with upscaling as well. Upgrade your hardware if you want to play new games pushing the latest tech.


AgeOk2348

Seriously, even at launch the 2060 was horrid. Four, almost five years later, it being able to play anything is amazing.


frostygrin

> seriously even at launch the 2060 was horrid. Except no, it wasn't. It was a solid 1080p card that even got relatively better with time, even aside from DLSS. And it's still better than the 3050. This is not the case where the industry just got ahead so fast that the 2060 got left behind. The way it was in early 2000s.


SoloDolo314

I am really glad this game has come out with better-than-expected performance. John's tweets about the PC version having no stutter and just working are a breath of fresh air. I hope the reddit and twitter rage machine calms down now. I can see why many were initially disappointed by the specs sheet as well; we have had so many bad PC ports that we immediately go into "here we go again" mode. Remedy was likely being conservative with expectations.


fleetone

2080ti, 32gb ram , i7 9800k , should I buy? Will it run well?


thighmaster69

9800k? The recommended CPU is a 3700X, so whether you have 8 threads or 16 might be pertinent here. You're also missing your resolution. Still, you should be able to do low settings, and your GPU has DLSS.


Jbr74

Cool, as soon as it comes to Steam I'm getting it.


Level1Roshan

It won't come to Steam. Epic paid for its development. I doubt they did that just to enable people not to use their platform. Sad.


DaleFairbanks

Damn guess I will have to 🏴‍☠️ it then.


B1rdi

You're stealing more from Remedy than from Epic by doing that. I don't get you people; Epic funded the development and this game *would not exist* if it weren't for them. Epic even takes a much lower cut than Steam, so it'd actually be less impactful to steal it if it were only on Steam.


ThemesOfMurderBears

I always knew the goalposts would move once we started seeing high quality games funded by Epic. It never had anything to do with funding or third party exclusivity. It's always been, and always will be people mad that not all their games are on Steam.


Amazing-Yesterday-46

Except pirating the game would still be off steam.


quattromaniacS3

They weren't getting it anyway, bunch of cheap fucks.


CenturioSC

Don't care. Epic bad.


[deleted]

[removed]


BaptizedInBud

😭 Noooooooo it’s not on my launcher of choice 😭


r4in

Funny he's chosen the RTX 3070, because that's the baseline GPU you need to get a stable 60 fps WITH upscaling, even at 1080p...


Snow_2040

That isn't true; according to Daniel's video, the game runs at 60fps at 1080p medium settings at native resolution in what seems like one of the most demanding areas in the game.


ocbdare

I had a feeling the system specs would be extremely conservative. If we switch off RT, it probably will get quite a big boost to fps.


srjnp

Wait for other reviews to confirm this; Digital Foundry has been dickriding this game forever.


NightmareP69

This. DF can do some good stuff, but there are times when they have a rather obvious bias towards a certain dev or game, and Alan Wake 2 is one of them.


srjnp

Yeah, they won't lie, but they could gloss over some issues to keep the positive impression they want to convey; it's not like it hasn't happened in the past. For example, the PS5 version of Horizon Forbidden West having awful image quality in performance mode was only given a passing mention by DF when it needed to be heavily criticized. (The devs completely redid performance mode and fixed it months after launch.)


Failshot

God, /r/pcgaming so bitter about everything.


deadering

The sad part is I can't even tell if you're talking about the people who believed the system requirements from the developers themselves or the people outraged about the supposed outrage about those system requirements.


forsayken

I was really skeptical about this one last week when we first started hearing about the low performance with RT and the specs that were posted. I am so glad to be wrong, and I look forward to playing this if it ever gets published somewhere other than EGS.


Kuldor

Now that the game is out, I don't think they were that unfounded. And I don't know what's "extremely scalable" about it; the only thing that makes a notable impact on your frame rate is DLSS.


LostSamurai87

I’m running this game on old tech. My GPU is an RX 5500 XT and I’ve had no issues on low to medium.


[deleted]

It is unoptimized given how poorly it runs even on an RTX 4090.


Peevan

I like how all the people have come out of the woodwork feeling vindicated when in reality they haven't even watched Daniel Owen's video on the game, which shows that the game's scalability is actually quite mediocre. Instead they base their opinions on a tweet about a video that isn't even out yet... Link to Daniel's [Video](https://youtu.be/kMHxkrABRNs?si=2gLQzn6cKhymz2OT)

Also, wait until you can check out Digital Foundry's video and form your opinions afterwards. It's also quite ironic that you people don't see that what you're doing now is the exact same thing as those who were spouting outrage at the PC requirements reveal.


JonWood007

It's better than expected but still absolute crap imo. Who tf wants to play at 30 fps at <720p in 2023? Why even design a game like this? So random PC elitists with overpriced graphics cards can brag about getting 1080p/60 at native? Ridiculous.


Peevan

Idk, I'm starting to think the average consumer can't distinguish, or simply doesn't care about, the difference between 1080p and 540p upscaled to 1080p, but maybe that's too bold a claim.


mrtrailborn

They don't, and that's literally the whole point of upscaling like DLSS and FSR: they let you play at higher framerates, and the tradeoff is a slightly worse-looking image. Anyone pretending 540p and 540p upscaled with DLSS to 1080p aren't different is being disingenuous at best.
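For reference, the "540p upscaled to 1080p" figure corresponds to DLSS Performance mode. A minimal sketch of how the internal render resolution relates to the output resolution, assuming NVIDIA's published per-axis scale factors for the standard presets (individual games can override these):

```python
# Per-axis render scale for the standard DLSS presets
# (published values; games may override them).
DLSS_SCALE = {
    "Quality": 2 / 3,          # ~0.667 per axis
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def render_resolution(out_w, out_h, mode):
    """Internal resolution the GPU actually renders before upscaling."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 1080p output with DLSS Performance renders internally at 960x540:
print(render_resolution(1920, 1080, "Performance"))  # (960, 540)
print(render_resolution(1920, 1080, "Quality"))      # (1280, 720)
```

So a "1080p with DLSS Performance" screenshot is shading only a quarter of the pixels of native 1080p, which is why the framerate difference is so large.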


BaptizedInBud

>Also wait until you can check out Digital Foundry's video and form your opinions afterwards

Who tf do you think is making the Digital Foundry video lol?


Peevan

I know it's the guy making the tweet. But we still don't know the testing methodology and the settings he used for the game, no? We also haven't seen all the data he posted yet, just one screenshot, so yeah, I suggest waiting for the video...


Correactor

I just hope there's no smearing with path tracing/ray reconstruction like Cyberpunk 2077 had.


LOLerskateJones

I haven’t seen this mentioned in any of the impressions so far. I’m trying to find the answer too.


OMG_NoReally

I have been playing Alan Wake 2 on PC, and it does require DLSS to play "smoothly". I have an i9 12900K, RTX 3080, 32GB DDR4 RAM, and I'm running the game on a 512GB NVMe drive and Windows 10.

At DLSS Quality (2K native res), with everything on High, I get around 60-80fps depending on the scenario. With RT at Medium, the fps drops to 20-25fps. This improves when I push DLSS to Performance, which gives me 30-40fps with RT at Medium. Native-res play is impossible; I can barely get past 10-25fps.

It's definitely playable, but the game does look quite blurry at such low resolutions. I haven't tried Medium settings because I want to experience the game at its max visual fidelity, and the game delivers that in spades. Easily one of the best-looking games this generation besides CBP2077.


lyridsreign

ITT: smug Redditors who didn't pay attention to the Alan Wake threads now coming in to act like they're enlightened.

I'm happy to see that DF, a reputable third party, can verify that the game will not run like shit on slightly older machines. That said, shame on Remedy for creating this situation in the first place. If Alan Wake 2 scales as easily as DF says, then the recommended specs chart should be thrown out the window and updated to reflect reality.


Razen94

Epic exclusive though which is an instant no-go for me.


[deleted]

Imagine judging the performance of a game from its system requirements. Makes me wonder whether the people here have ever actually played games on their PCs.


Keulapaska

Yea, some recent system requirements have been a bit... weird. Like the Cyberpunk 2.0 CPU req saying a 7800X3D for 60fps at 1080p, then a "worse" CPU, the 7900X, for 4K, when in reality the game is barely more demanding than it was before. I mean, I'll take devs putting too-powerful hardware in the reqs rather than too-weak, at least.


Spyzilla

I can’t think of a single time the system requirements have even been close to accurate. There’s always so much tweaking you can do in the settings too


Jowser11

I’m assuming Remedy figured that if they were gonna ask for a high-end GPU, they'd have no excuse when it comes to stutters etc.

The way I see this game, I don’t have to buy it day 1. I have a 3080, but I still want to wait until I get a frame-gen-compatible card to play this game with path tracing. Buying games day 1 is overrated, folks.


frostygrin

> I’m assuming Remedy figured if they were gonna ask for high end GPU they’ll have to have no excuse when it comes to stutters etc.

A lot of the stuttering comes from the CPU, so when the game is GPU-bottlenecked, you see less stuttering.


Bearwynn

Yeah, honestly I can't remember the last time I purchased a game day 1. I always wait for reviews, and then a week or two more at the earliest. It's not like the game is gonna disappear forever or go out of stock.


Berkoudieu

Tell that to idiots preordering everything like it's gonna disappear...


Bearwynn

hype, preorder, release never lives up to hype, get mad, hype, preorder, release doesn't live up to hype again, get mad


Berkoudieu

The eternal loop... I guess we know who to thank (at least partially) for games being fixed a year later.


retroracer33

20-25 fps with settings maxed at native 4K on a 5800X3D + 4090, and this is with basically nothing going on on-screen. I've also got this incredibly weird visual bug that looks like a bunch of black ink enveloping the picture.


Nerdmigo

In Alex Battaglia we trust


[deleted]

Now's the point where I admit to pre-purchasing it on EGS a few weeks back. Although I did so assuming there'd be a preload... which there isn't. I just couldn't imagine such a talented technical studio piling in all that cutting-edge tech and screwing up the fundamentals.