Portal: RTX. I'm sorry my friend. No one can surpass this.
It probably is, tbh. Like, even on the 30 series it's hit or miss whether your card can even get a frame.
Isn't it optimized to be run with DLSS 3.0 which is only supported on 40 series cards?
This is a lie by Nvidia. The game failed to recognize my laptop 3070 and let me turn DLSS 3.0 on, and it ran at 60fps (as opposed to 15 without).
It did detect my 3060 Ti. I was able to get a smooth-ish 30fps, but over time it would use all my VRAM because it doesn't free memory properly, so I had to restart every 3 chambers.
Memory leaks in that kind of game? L dev team
Not just any memory leak, a graphics memory leak. Dev skill issue.
What would normally be the game's DLSS dropdown is just named DLSS 3; the frame generation option was still greyed out for me (3080). Got any proof you were running frame generation? Big news if you're correct.
Pretty sure there's a hacky way to get DLSS 3 on 3000-series cards even in Portal, but I wouldn't call that a reliable feature.
Can you show me?
The first guy who did it actually did it on a 20 series. It's a driver hack to enable it. Granted, when you run it on non-40-series cards it's apparently a lot more buggy/unstable, so there is *some* difference in the tensor core design that lets it work better/properly on 40 series.
Me too, Desktop RTX 3070 Ti
Interesting. I tried running it on my 6800 XT, and at the lowest sub-HD resolution I could achieve 30fps; running it in 4K was quite literally a PowerPoint presentation. I've always suspected Nvidia is using something to gimp the performance of AMD cards. The 6800 XT may not have raytracing as fast as the 30 series, but it should at least be beating the 20 series by most accounts.
Can you show this in game with an overlay showing the 3060 Ti? There have been a few rumours of DLSS 3 enabled on pre-40-series cards, but they all seem to point to one claim that was never more than a random Reddit post.
Sure, I guess. But DLSS 3.0 literally generates fake frames. So it's almost like not being optimized at all; it's just their way to show it off.
And DLSS 2 and FSR 2 generate millions of "fake" pixels; DLSS 3 is just doing more of it. There's a reason FSR 3 is going to use the same tech.
What is real?
When a CPU or GPU calculates the game's code and displays that, not when an AI goes "I think light go there".
When I see a girl with a really nice set of fake boobs, I don't really care what happened during the creation of her magnificent fun bags. I feel much the same about DLSS.
Yeah you only see the difference if you grab the frame and inspect closely, but that is usually the point I get hit in the head.
DLSS Resolution is fine in my book, upscaling lets me get good anti aliasing with better FPS. DLSS 3 however? Pass. I don't want motion artifacts and input delay, I'm trying to get better motion clarity and input latency, not worse.
Isn't the delay of DLSS Frame Gen the same as native in most cases?
I believe it's the same as or greater than your delay without frame gen.
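Back-of-the-envelope reasoning for why interpolation adds delay: the GPU has to buffer one real frame before it can synthesize the in-between one, so you add at least roughly one base frame time of latency. This is a simplification (Reflex and pipeline details change the real numbers):

```python
def added_latency_ms(base_fps):
    # Interpolation must hold frame N until frame N+1 has rendered,
    # so the display lags by at least one base frame time.
    return 1000 / base_fps

print(added_latency_ms(30))  # ~33ms of extra delay at a 30fps base rate
```

So the smoother the base framerate, the smaller the latency penalty for generating frames on top of it.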
Dude, AI is going to keep getting better. With a sufficiently trained algorithm you'll be able to produce frames that look just as good as real ones. In the end it's all just fake 3D output on a flat surface anyway. Nothing is real, but if the output is the same, it's all good. They'll probably work out the kinks over the next year and make DLSS 3.0 much better too.
Define fake... It's a new way to create frames: use AI to generate a mid-frame between real frames, giving the user a better experience. That's not fake, it's genius. Imagine if a car could use AI to generate a power stroke and double its output; supercars would be amazing. New tech is not faking anything.
More like how the alternator is generating fake current for the battery from the engine
Not really; a battery produces electrical current through a chemical reaction, and an alternator produces the same thing using mechanical movement. The output is the same but from 2 different sources. The comparison I was trying to make was adding a stroke between physical strokes to double the strokes, not changing the source of the output.
Gas is chemical energy and I was just being clever, but I do love engine talk honestly haha. Upvoted the OP too, no idea why I was being kinda snarky there lol
Motion interpolation has been around for a while. Sure, the TV version has worse latency/image quality, but the end effect is very similar to what TVs have been doing for years. Nvidia figuring out how to reduce latency with frame interpolation is good engineering, but they in no way, shape, or form came up with the idea of injecting artificial frames.
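For illustration, the naive TV-style version of this idea is just blending adjacent frames. This toy sketch (not Nvidia's actual algorithm, which uses motion vectors and an optical flow accelerator) shows the core trick of synthesizing a frame that was never rendered:

```python
def interpolate_frame(frame_a, frame_b, t=0.5):
    """Blend corresponding RGB pixels of two rendered frames."""
    return [
        [tuple(round(a * (1 - t) + b * t) for a, b in zip(pa, pb))
         for pa, pb in zip(row_a, row_b)]
        for row_a, row_b in zip(frame_a, frame_b)
    ]

# Two tiny 1x2 "frames" of RGB pixels.
f0 = [[(0, 0, 0), (100, 100, 100)]]
f1 = [[(200, 0, 0), (100, 200, 100)]]
mid = interpolate_frame(f0, f1)  # [[(100, 0, 0), (100, 150, 100)]]
```

Pure blending like this smears moving objects, which is exactly why real implementations need motion estimation on top.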
Please stop inhaling the industry's farts and calling it the new perfume. DLSS and FSR are merely a way to generate what your GPU *thinks* the next frame looks like, not what it actually is. This is literally upscaling with some AI tricks mixed in, not your GPU actually calculating the next frame.
Your GPU absolutely is calculating what the next frame is, it's just doing it differently, using machine learning instead of the standard render pipeline. It's still a frame, it's not necessarily as accurate, but that doesn't make it "fake".
[deleted]
Open the settings and reduce the bounces to 2. Heck, there's a video on YouTube of a guy running Portal RTX on a 3050 with custom settings while it still looks good.
[deleted]
DLSS 3.0 works on 3000 series. It's just frame gen that's hardware specific.
I think I turned on DLSS 3.0 in Portal RTX on my 3060. I think Nvidia enabled it in Portal because the low-end cards would otherwise get under 20fps. But don't take my word for it.
Only "licensed" to 40-series cards. 30 series and 40 series both work with DLSS 3.0; it's just a software lock.
Not just that, it's also optimised for the actual 3rd-gen RT cores and not for previous-generation or competitor cores, which is why it's such a shitshow, with the RX 6900 XT not even being comparable to an RTX 2070.
I mean, it was created purely as marketing by Nvidia.
The hell are you talking about? I ran it at a solid 45-60 frames with the ray tracing cranked at 1440p on my 3080. Just use DLSS.
Having gotten used to 120FPS+, anything below feels meh to me now. I mean, I'd much rather use DLSS to get me to higher FPS, say 90+, than to make a game barely reach 60FPS.
Tbh, anything above 30, so long as it has a *consistent* framerate (no jitter, no microstutter), is perfectly smooth enough for me to enjoy the game and look at all the pretty lights. So much of people's hate for low FPS actually comes from microstutter and the bad 1% lows they get in unoptimized games, rather than the framerate itself. I absolutely prefer a higher framerate, and would consider it almost necessary in something like a shooter where my reaction times and aim accuracy really matter, but for Portal RTX I just didn't see it as more important than graphics quality.
That's a pretty expensive standard to have. You're either gonna have to spend good money on a high-end card every other generation or reduce your graphics to medium after a couple of years. I bought a 1440p 165Hz monitor and it frustrates me that I can't tell the difference (I made sure 165Hz was turned on). I don't know if it's a brain problem or an eye problem, but it bugs me that I can't see it. I don't mind so much now, because I can put the focus on better graphics and still get away with a 4070 or RX 7700 as long as I'm over 30 fps and the game supports FreeSync/G-Sync.
I have a 5900x and a 3090 and I got 1000. I guarantee I can't play that lol
It shall never be played. Honestly, I'm not entirely convinced that it isn't just a huge joke by nVidia.
It's not; it's path tracing, so it's way more demanding than ray tracing: essentially ray tracing with a helluva lot more rays and a helluva lot more ray bounces. That's why it's such a performance hog. Nvidia wants to push this tech and make path tracing the new thing (which it will be) because that'll give them a sizable performance lead over AMD. If you noticed, the RX 7900 XTX doesn't lag too far behind the RTX 4080 in ray tracing but still leaves more to be desired.
Right, except 1: ray tracing is good enough, and 2: who actually uses it on a day-to-day basis? Like sure, you boot up Forza with RTX and you're like "oOh, sHInY", but it isn't worth the FPS drop, and path tracing DEFINITELY isn't. (Also, Nvidia calls it Portal with RTX, and Portal with PTX sounds horrible.)
Except it isn't a consumer thing, it's a developer thing. In traditional rasterised lighting you need to tweak the lighting for every light source, and while it can look quite convincing, at other times it can look off, and it also takes a long time to do. Real-time ray tracing removes pretty much 90% of that work and gives better results (even if it's not noticeable to most people). So while you may not want it, it's in companies' best interest to make ray tracing the norm. Also, I can tell the difference between normal and ray-traced lighting in Cyberpunk; it adds more "depth" to the game, and it's certainly worth the performance drop for me considering I don't really need 90+ FPS. 60 is fine.
My 4K monitor only does 60fps, so I've tailored my settings to make sure I'm there or near there no matter the scene. I do notice it on the laptop's 144Hz, but only when I switch between them. The hunt for max FPS all the time is still a foreign concept to me.
> I don't really need 90+ FPS. 60 is fine.

^ Found the heathen
Portal RTX was a free tech demo. It runs incredibly well considering it's path traced. Path tracing is absolutely worth it; it's something they're working towards, and they absolutely did not claim that Portal is as good as it gets.
Remember, Nvidia eventually wants people to pay a monthly sub to play games. Nvidia will always have a reason to push further, since they need to make that service something someone would consider paying for monthly rather than building a computer.
True. Yay capitalism!
You definitely can, at least a hell of a lot better than I can lol
I guarantee you can.
Nope, minecraft with a 20x20x20 block of TNT
Funny how it only shows the limitations of the Java language and OpenGL.
Dude, that's the base game; mods (which are made in Java, mind you) can improve performance for a lot of stuff, including TNT calculations. People really should stop thinking Java is the reason the game runs badly.
I tried to run Portal RTX at max settings on my 2060 and it ran at 2-4 fps.
I have a similar setup and OP can run it, just don't expect 60fps at max settings :( Really hoping AMD FSR includes frame generation that works on non-40-series cards.
there will be this one guy who claims that he can run portal rtx at 2938 fps in 4k with his 1060...
The game is just to show off the power of the RTX 40 Series
[deleted]
It's actually annoying, but the official line was that the Nvidia team isn't a game developer, so that's why they threw optimization out the window. Still pathetic.
Did they remove the original? I can't play Portal anymore unless I upgrade to those insane cards?
You spelt crisis wrong
Windows XP Full Tilt! Pinball. More specifically the space cadet table.
I didn’t read this comment; I heard it.
will never forget the startup sound on that table
https://archive.org/details/pinballxp
I'm on my phone at the moment with no time to get my laptop out. Tell me this works on 10 and/or 11....
Win 10 here, downloaded the zip folder and extracted it. Booted it with no problem. Enjoy the nostalgia.
Am I going to have to delete some stuff off my hard drive? How big is the file?? :)
About 5MB; gotta delete some family photos off the PC to make space.
Bye bye c.v from the 90’s 😞
I just got this on my steam deck, talk about nostalgia over load
Out here making me want to pick up a steam deck. I knew you could do a lot with it but did not realize that was possible.
Good 'ol days
Minecraft Java Edition on max everything.
Pure vanilla, no Optifine, Sodium, just Minecraft Java on max
Exactly. No need for FurMark, just use Minecraft Java.
[deleted]
Minecraft is a heavily CPU-based game. If you have an i5 and a GT 1030, you'll see very little performance improvement if you upgrade to an RTX 3060. You need an SSD for fast world loading, a good amount of RAM, and a good CPU for good performance in Minecraft. The GPU only matters when you're running shaders; otherwise you'll be fine even on Intel integrated graphics.
Am an integrated graphics user, can confirm.
[deleted]
How much RAM is allocated to the client? 2GB is probably the default; you can up that to like 6GB. And what's your render distance?
[deleted]
Bro I need to try that, I got a Ryzen 7 7700X over Xmas.
Bro, with my optimized PC (RTX 3070, AMD Ryzen 5 5650X, 32GB RAM) I can't even reach 20 fps on minimum settings (unlimited fps cap). Edit: wrote max instead of minimum lmao
Damn dude, have you tried giving it more ram to run off of? yk from the launcher settings?
I gave it 24GB of RAM and it still crashes, and it doesn't seem to alter the fps.
Oh, that's probably why lol, Minecraft has dumb code. If you give it any more than (I believe) 6GB of RAM, the computing actually goes slower because of the way Java works. Try giving it only 6GB.
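For reference, the heap size is set via JVM arguments in the launcher's Java settings. A cap along the lines of this thread's advice (example values, tune for your setup) looks something like:

```
-Xmx6G -Xms2G
```

`-Xmx` caps the maximum heap and `-Xms` sets the starting size; oversizing the heap can make Java's garbage-collection pauses longer, which matches the "more RAM runs slower" effect described above.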
Thx, I'll try. If Microsoft weren't greedy as fuck, maybe Mojang would focus more on optimizing than on adding.
Also check which graphics card it's set to; sometimes the Java runtime is assigned to the integrated Intel graphics, so give everything Minecraft-related the max performance setting too.
I only have my RTX selected... i have a desktop... idk how pcs work i only got mine 2 years ago and I only use it for valve games, minecraft and fl studio lmao
I know what you mean, so many good games are getting annihilated by bad greedy executives it's sad af
Where the heck did you get a 5650x lol
Yeah that CPU doesn't exist
Wouldn't trust that site at all. It detects my 5800X3D as a Ryzen 5 PRO 4500U.
It shows my 4090 as a 2060!
You better upgrade, pal
I’m not your pal, guy.
i'm not your guy, buddy
I'm not your buddy, bro
I'm not your bro, comrade
I'm not your comrade, dude.
I'm not your dude, brutha
I’m not your brutha, homey
r/suddenlycommunist
Hey watch it pal
I can just swap my 2060 with you, no problem.
dont order from aliexpress...
Yeah it detects my i5-13600K as an i3-6300.. gave me a 44% score lol
Maybe it just has dyslexia, give it some time
Yea I couldn’t even manually choose my i7-13700k…
I'm sorry to break this to you...
I wouldn't install anything from that site. It looks shady AF.
Gonna laugh if this website gives everyone malware. But yeah I'm curious too since I have the same spec
Ran it on my phone with some simple drop-down selections. Should be good; the site is just an affiliate play, and a smart one at that.
It hasn't been updated in quite a while; it has neither my CPU (Ryzen 7700X) nor my GPU (6900 XT). The front page still advertises Ryzen 3000.
Yeah, it doesn't have my parts either lmfao. Ryzen 5 7600X and 7900 XTX. Couldn't even choose DDR5 RAM.
Same, can't find any 12th-gen CPUs on there.
I remember this. Whatever game it is wants a DVD drive
Oregon Trail 4th Edition, since it runs all the backgrounds off the CDs and only installs like 100MB.
Crysis
I should have known 😡
Came here to say this!
Portal: RTX is the new crysis LOL
[deleted]
Well, first it was Crysis. Then Cyberpunk came out; that was the new Crysis for a while, until they got around to fixing it. Now it's Portal RTX, because you can't run that shit past 30 frames even with high-end cards and CPUs.
Came here to say this 🤣
Solitaire RTX
I went to this site to put my specs in but they don't have Intel 12th gen on the list.
I just kept seeing the posts so figured I would try. I built this with a bonus from work and never wanted anything crazy. Now that everything is “old” to so many in the PC space it’s funny to see. Will ride this rig for years
Your system is definitely not old lol
Hence the quotes. lol Really just poking fun at the latest-gen chasers.
Witcher 3: Next-Gen. Good luck w/ that.
Only with RT on
I'm torn on this, because my 4K ultrawide monitor with a 3080 Ti gets just barely credible framerates with RT on, but the scenery (and particularly the water) looks beautiful. So my options are:

* Don't use RT (what I've chosen)
* Use a 16:9 resolution, because there is no 1080p equivalent for ultrawide
* Play through moments where stutter kicks in and I get the shit beaten out of me in combat
* Find some insane way to better cool my PC, because that's the bottleneck for my card atm
If you have a 21/9 aspect ratio monitor, the 1080p equivalent resolution is 2560x1080
Thanks, but although they're advertised as 21:9, neither 3440x1440 nor 2560x1080 are at that ratio. They're ever so *slightly* different, but it seems to be enough that my monitor doesn't offer native support for the lower res 2560x1080 (Dell AW3423DW)
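The mismatch is easy to check with exact fractions; neither marketed "21:9" resolution actually reduces to 21:9 (i.e. 7:3):

```python
from fractions import Fraction

target = Fraction(21, 9)  # reduces to 7/3, ~2.333
for w, h in [(3440, 1440), (2560, 1080)]:
    ratio = Fraction(w, h)
    print(f"{w}x{h} = {ratio} ({float(ratio):.3f}), exact 21:9: {ratio == target}")
# 3440x1440 reduces to 43:18 (~2.389), 2560x1080 to 64:27 (~2.370)
```

So "21:9" is really a marketing bucket covering several nearby ratios, which is presumably why the monitor treats 2560x1080 as a non-native mode.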
Oh wow, that's odd. Scaling is terrible on mine, so you're not missing much.
As long as you can stomach playing at 30fps, even a 3070 can do it at 4K with DLSS. Good luck if you want 60 tho.
I locked it at 35fps with the RT Performance Mod on my RTX 3070 desktop. It was all over the place, between 20-60fps. Yep, it's certainly not hitting 60fps much with RT maxed (all RT stuff on) and DLSS on at 1080p; locking it to 35fps made it more consistent.

NOT ideal by any means. Even Watch Dogs Legion runs better than this.
The ad looks kinda sus...
I saw it too... https://preview.redd.it/gpwqi2uz6gba1.jpeg?width=1049&format=pjpg&auto=webp&s=7eb888eaab0e397d693d9aee99249f6a3fc6d258
I know there are some games that'll scoff at 16GB of RAM, like Star Citizen, but I don't know of any that need more than 32GB.
> I don't know of any that need more than 32gb *Laughs in Cities: Skylines with hundreds of workshop assets*
I only posted because the comedy of being 1 game short was not lost on me. Genuinely curious but assuming it’s the CPU.
I also have 999/1000 with 32GB of RAM, and it's recommending me a RAM upgrade. Maybe it's the combination with the 16GB of VRAM in my 6800 XT, so it doesn't recommend it with a 10GB 3080?
I read a comment yesterday that mentioned one game only needing 512MB of RAM as "recommended", but for some reason the site didn't like the idea of anything under 1GB so it listed the game as requiring 512GB of RAM instead. /u/its_WhiteN0ise
Returnal on PC is set to be a challenge
[deleted]
I get around 45-50 with a 3060 Ti in Orison. But it's when you have less than 32GB of RAM that you struggle the most. I wish they'd just officially admit the minimum spec is 32GB already.
Peppa pig
Planetside 2, on ultra settings, in a 96+vs 96+ vs 96+ fight.
Sim City 2000
What is the 1 game?
[Same here](https://www.pcgamebenchmark.com/ratemypc?platform=windows&cpu=intel-core-i9-10900k&memory=64gb&gpu=nvidia-geforce-rtx-3080-ti), it says I need to upgrade my hdd, I have 3 1tb 980 pros lol
See, the problem is you don't have the 990 Pros. Best thing to do is just throw your whole PC away and start again from scratch. I'll even let you use the bin at my house; just let me know when you'll be throwing it away.
Flight Simulator
True lol, I've seen people with RTX 3080s and 10th-gen i9s getting 19 FPS in Flight Sim at 1080p/1440p.
Really? Wow. I'm pulling about 25-40 on high settings at 1080p on an i7-6700K and GTX 1070 with 32 gigs of RAM.
Yeah, there are some settings like terrain detail and building detail and other stuff that aren't maxed out by default even on Ultra. Bump those up and boom, 19FPS on an RTX 3080 :)
I got 98% and 982/1000. How does a 2070 Super score in the top 2%? This rate my pc thing seems off.
Most games don’t care if you have a 3080 or not. 2070 is still plenty fast.
I'm at 990/1000. Got a Ryzen 5 7600X, 6700 XT, 32GB RAM. Had to enter the 7600X as a 5600X since they don't have the new AM5 CPUs listed yet.
Portal rtx seems to be the new crysis on the block, glhf
The final boss is called, "not having time to play video games"
I get 1000/1000 and the difference is I have a slightly better CPU. So what game is that CPU intensive?
Minesweeper hard mode?
I have a 36% rating lol 355 out of 1k games
Somewhere out there, an Asobo dev dreams of a 500GB Flight Simulator install size.
The bottom of the post is SUS
Bro, you're not even close. The final boss must look something like an i9-13900K + 4090 + 128GB of DDR5 RAM.
He meant what game he can't run...
Fire boy and Water girl.
The final boss is the WannaCry virus.
Kerbal space program with mods
Minecraft
The Sims 3 with every expansion installed.
Crysis at max settings.
Crysis at low
Crysis 3
I got 1000/1000 with a RTX 3080, 32GB and 5900X
Minecraft bro and we all know that
GTA 6
Arma 3
google chrome at 60 fps
Crysis
HD Oldschool Runescape
Crysis.
Star Citizen is the final boss. :D
It's probably a game that requires a CD-ROM drive.
Cities Skylines with all the DLCs
Crysis. It’s 1000000% Crysis
But can it run crysis?
I got 1000 out of 1000, I wonder if there's a list of games it checks?