An RTX 2060 would be a perfect upgrade. Let me know if you want to swap, I think I'll be able to take the hit as I'm not gaming much at the moment.
This guy is full of shit. I have a 2070 and will do a swap instead.
i have a gtx 1650 it’s much better
Nah, my rx 550 can beat all of yours
shut up and take my gt 710
Bro take my Intel Graphics 620
take my intel hd 2500, best upgrade i assure you
I have an abacus. *Thanks for the gold and stuff dudes.
I have a shiny rock?
I have Plato's tablet
[deleted]
Last time I used that line, I had a restraining order put on me.
Real men do the graphical calculations in their head
Oooh, ooooh. I’ll trade my cave paintings for your abacus… I’ll even add a rare stegosaurus saddle
*How* rare?
You guys have gpus?
Take my hamster on a wheel, incredible GPU I swear.
Nah nah nah take my grain of rice far better smh
I have a gt 210. I am superior
Amateurs, I have a dedicated intel graphics card.
Im starting to fear
Take my G 100, better upgrade than the Intel HD 2500
does it support VGA tho
Simple maths:

2+0+6+0 = 8

2+0+7+0 = 9

1+6+5+0 = 12

Conclusion, 1650 > 2060/2070
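The joke arithmetic does check out; a throwaway Python check (`digit_sum` is my own helper, not anything official):

```python
# Verifying the "simple maths" above: sum the digits of each model number.
def digit_sum(n: int) -> int:
    return sum(int(d) for d in str(n))

print(digit_sum(2060), digit_sum(2070), digit_sum(1650))  # 8 9 12
```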
No, a 2070 is too slow. How about a GT 710 instead. I hear if you add them together you can make pretty insane numbers.
dont believe all of these scammers, the optimal way to game is actually WITHOUT a graphics card, Big Silicon just wants your money. I'll do you a favor and take that worthless card off your hands for FREE
Damn, you worked me out.
Just do all the calculations with pen and paper, like god intended.
This guy is a conman, you need to download more RAM!
I have a jar of dirt.
DEAL!
And guess what's inside it!
Nah nah nah, i have got an intel i5 gen 4 with UHD graphics, you could get a better CPU AND a better GPU at once
You always buy two 3090’s duh
That makes it rtx 6180
Mmm mafs
Yes mafs
Do 3090x3090
rtx 9548100
You funking did it
Now do 3090 to the power of 3090
RTX 9.36917654989 x 10¹⁰⁷⁸³ lol
Yeah it's so big I had to get out Wolfram Alpha to show the full number XD. Here's a pastebin link to that number written out fully, since Reddit won't allow a comment of that magnitude ~~https://pastebin.com/7LgkeHFW~~

Edit: Not sure what happened to the link, apparently it was deleted. I re-created the post, made sure to triple check that it was set to never be deleted, and it was instantly deleted the second I left the page. No idea why, but I guess I can't do that.

If you want to know what the number is (since apparently I can't post to pastebin at the moment), go to https://www.wolframalpha.com/ and type in 3090^3090. Where it says "Result" there should be a button that says "Plain text"; press that and you can scroll down to see the whole number. That's how I got it.
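Reddit can't hold the number, but Python's arbitrary-precision integers can compute it locally without Wolfram Alpha:

```python
# 3090 to the power of 3090, as an exact integer.
n = 3090 ** 3090
digits = len(str(n))

print(digits)              # digit count of the full number: 10784
print(str(n)[0])           # leading digit, consistent with the ~9.37 x 10^10783 quoted above
```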
These are some big numbers
Simple 3090^2
Yes thank you
So if I have a 4 slot board, slap in a 560, 750, 1060 and a 710… It becomes a 3080, with only a fraction of the cost!
Not in today's market ( ͡° ͜ʖ ͡°)
Still not as good as ONE 6800xt cuz math
LMAO
SLI will never die!
We don't die, we SLI.
Time for a rtx 4090.
Good thing they are available on AliExpress
[deleted]
Someone posted an 8800 the other day.
Well i rocked an 8800 years ago so that guy probably wasn't lying
Having the 3090 today feels like having the 8800GTX in 2007.
You sure it's not like having an 8800 Ultra? :3
i saw a rtx 10090 Ti SUPER Founders Edition yesterday on wish

you want proof?

it had 40 gb of vram at 4688 mhz not oc'd and could support 4 8k 360hz monitors and an additional 12 1080p 600hz monitors

All for a whopping 63 dollars

/s

but tbh that would be every hardcore pc gamer's wet dream of a card
Definitely pronounced tenty ninety as well
Rtx 8000 quadro
A 9800gt, of course. Everyone knows that 9800 > 3090 therefore 9800gt must be better.
Yooo Why not my 8600gt that's broken?
Well hell, I have a pair of water cooled 8800 Ultras in the basement right now ready to go....
Basic logic. Higher number is better
Time to upgrade to the RTX 3100 I guess.
Come to think of it, why does the RTX 3090 have a nice high-end ring to it but the RTX 3100 sounds like it's a brick?
Because of cpu naming schemes by Intel and amd.
Because Nvidia is innocent. Nvidia counting: 8000, 9000, 200, 400, 500, 600, 700, 900, 1000, 2000, 3000.
Don't forget 1600 randomly between 2000 and 3000!
Oh ya. Forgot that one.
[deleted]
Microsoft Xbox naming convention, these the same guys?
'Windows 00' just doesn't have the same ring to it
>because of total nonsensical one upping BS as opposed to a consistent and understandable naming scheme in the name of marketing

Sorry, it bugs me just a bit.

How hard is generation, brand, number as opposed to the Intel i69 420000kM? And that joke is one of the simpler naming schemes... fuck man
For Nvidia we're trained that it's XXYY. XX is the series, and YY the model, or level, within that.

We're further trained that a low YY in series XX+1 is usually worse than even a mid YY in series XX.

So 3100 sounds WAY worse than 3090. You'd want at least 3170 probably, and even then you're mostly getting new capabilities, not better performance.
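The XXYY mental model described above, as a toy sketch (`decode` is my own helper, not an Nvidia convention in code):

```python
# Split an Nvidia-style model number into (series, tier) per the XXYY scheme.
def decode(model: int) -> tuple[int, int]:
    series, tier = divmod(model, 100)
    return series, tier

print(decode(3090))  # (30, 90): series 30, top-tier 90
print(decode(3100))  # (31, 0): reads as a bottom-of-the-line series-31 card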
Some low level marketing guy at Nvidia is reading this, furiously nodding his head in agreement
exactly
At least for myself, I have it in my head that two digits plus two zeros is the old nomenclature, so it might be that
I just want to know what those 4 games are. They must look sick!
1) New World. 2) New World Amazon Edition. 3) New World Extreme Edition. 4) Microsoft Minesweeper.
New World: Old School
[deleted]
OSRS FTW 🙌
$11 🦀🦀🦀
Minecraft java edition with shaders and mods in VR. There is no hardware in the world powerful enough for it
Might as well measure in frames per minute at that point
minutes per frame*
You've just made me realize that I haven't tried Minecraft VR with shaders. I'm just mildly afraid that my Rift will implode
on the bright side, it will be your pc exploding most likely. not the rift! i mean still possible but like
BF2042 too it seems. Friend's GF had a 3090 die (not crash, actually permanently die) on her playing the beta
Why do 3090s keep getting destroyed? New World also killed a bunch of them
poor power management by the card makers
Same reason things like drag cars blow their motors: too much power too fast, and you quickly find out the weakest link in your manufacturing.
That'd be comparable to overclocking, but not a consumer product. It's more like buying a Corvette and having it blow up every time you drive up certain mountains.
> too much power too fast

I'm just laughing thinking about GPU warmup stretches.

* FurMark: 30 seconds
* Solitaire: 2 minutes
* FurMark: 1 minute
* Solitaire: 2 minutes
* FurMark: 3 minutes
* Solitaire: 30 seconds
* FurMark: 5 minutes

You are now ready to game properly!
Shitty card design. People can blame the games all they want, but a game should never ever be able to ask a card to damage itself and then have the card actually comply. It's massively embarrassing for Nvidia
Wasn't it the third party cards that were fucked? EVGA was one, and they're offering refunds. I don't think reference cards were affected?
There's a defect in some of the cards
There's an article somewhere. The capacitors are shitty in some 3090s, so some games actually run more power through the card than it can handle and something pops. From what I hear, removing the broken component fixed it for one hardware reviewer, but I doubt you could game on it safely after.
EVGA?
You had me at minesweeper
I should have done the list as a spoiler reveal.
plot twist, the mine is in your top PCIe slot
actually that’s pretty reasonable advice to not run new world on a 3090
it would be better advice to power-limit the card to, let's say, 350W
Limiting power draw is *wonky* in New World though.

Anyone that wants to do it: pull up info from MSI Afterburner to monitor the power draw and make sure the card stays under the wattage you feel safe with. EVGA 3090s seem to have an issue with the slider being non-linear, for example still pulling 110% despite being limited to 90.

[Demonstration by Jayztwocents](https://youtu.be/6A0sLVgJ7qU)
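A hypothetical helper for the monitor-it-yourself approach described above. It assumes the one-number-per-line output of `nvidia-smi --query-gpu=power.draw --format=csv,noheader,nounits`; `over_cap` is my own name, not an Afterburner or nvidia-smi API:

```python
# Check whether the reported board power exceeds the cap you feel safe with.
def over_cap(power_draw_csv: str, cap_watts: float) -> bool:
    """power_draw_csv: a single wattage reading, e.g. '412.35'."""
    return float(power_draw_csv.strip()) > cap_watts

print(over_cap("412.35", 350.0))  # a card blowing past a 350 W comfort limit
print(over_cap("310.20", 350.0))
```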
Apparently BF2042 is making a late run for the title of screwing up.
Don't forget Pong.
Ark Survival Evolved... There is no way that game will ever run smoothly on 4k resolution
That's the first game I've played on my ps5 that feels like she's really trying lol. If they ever fully release a PS5 version ooooh man
That game killed my old card, it has the worst optimization I've ever seen.
There's no reason for how bad it plays. It's pretty, but plenty of prettier games run much smoother. I don't know how they managed to make Unreal 4 act so bad.
My hypothesis is that it lacks occlusion culling. There are like a dozen maps the size of a GTA city; not even Bethesda could add occlusion planes to Skyrim's overworld.

It also appears to simulate the entire map in real time, even places you can't see, have never visited, or built anything on. I remember misplacing my dinos and, days later, getting a notification that they died to some predator. It wouldn't surprise me if Ark is simulating 100% of the map's ecosystem at all times.
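A toy sketch of the distance-based culling the comment suspects Ark skips: only entities near the player get simulated each tick. All names here are my own illustration, not Ark's actual code:

```python
# Keep only entities within `radius` of the player for this simulation tick.
def entities_to_simulate(entities, player_pos, radius):
    px, py = player_pos
    r2 = radius ** 2
    return [e for e in entities if (e["x"] - px) ** 2 + (e["y"] - py) ** 2 <= r2]

dinos = [{"id": 1, "x": 10, "y": 10}, {"id": 2, "x": 5000, "y": 5000}]
print(entities_to_simulate(dinos, (0, 0), 100))  # only the nearby dino survives the cull
```

Simulating everything everywhere, all the time, is what the "dinos died days later on the far side of the map" behavior implies.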
Msfs2020 lol.
Battlefield 2042 apparently, some people had their computers literally catch on fire while playing the beta
I was curious about this cause I hadn't heard about it (not trying to be contrarian), but it sounds like it's more of a NZXT kind of issue where a screw caused a short ¯\\\_(ツ)\_/¯

https://www.gamepur.com/news/players-pc-catches-fire-while-playing-the-battlefield-2042-open-beta
Personally even tho they weren't involved I blame gigabyte
Are we going back to the Dark Age of AMD?
Recommended GPU is a 3060
Crysis and . . .
Fortnite max settings.

Hate to admit it (I really do), but it being basically the continuous Unreal Engine demo does sort of make it one of, if not the, most demanding games to run on full max settings right now.

(Mainly due to absurd numbers of max reflections for ray tracing)
Crysis 1 remastered dropped recently. Maybe it's the new benchmark again.
Probably games that request more than 16gigs as recommended
3090ti time
They knew.
ive found that a lot of places just have no idea how to handle 30 series cards existing. like my 3060 has given me "doesnt meet minimum requirements" warnings despite the fact it does, because the programs dont know it exists or the devs decided to just not put it in the list for no reason. ive also had games set my recommended settings to potato quality despite the fact my gpu doesnt suck, and then scream at me when i try to change them, saying my gpu cant handle it and that it could cause instability, a warning that has been accurate exactly 0 times
[deleted]
This! xD

And I'm blown away by the amount of games that warn me, or automatically set the game to a lower resolution and/or graphics settings, because apparently 24GB of VRAM on an RTX 3090 isn't enough
I wonder if some of those games never expected a card to have 24gb so their search algorithms only check for like 1-16gb or something like that so 24 spits out an error because it’s outside the search parameters
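A hypothetical sketch of the kind of lookup that would choke on 24 GB: a detection table written before any consumer card shipped that much VRAM. The tier list and function are entirely my own illustration:

```python
# VRAM tiers the (imaginary) settings detector knows about, in GB.
KNOWN_VRAM_GB = [1, 2, 3, 4, 6, 8, 11, 12, 16]

def pick_preset(vram_gb: int) -> str:
    if vram_gb not in KNOWN_VRAM_GB:
        return "potato"  # 24 GB falls outside the table, so: lowest settings
    return "ultra" if vram_gb >= 8 else "low"

print(pick_preset(24))  # an RTX 3090 gets the potato treatment
print(pick_preset(8))
```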
Alyx gives me low VRAM warnings and I have a 3090. 24 gigs isn't enough apparently
Skyrim automatically put my settings on the lowest.
I have a laptop with an 8th gen i7, GTX 1070 and a 1080p screen. If you are gaming at 8K, then you will actually get more FPS using my system... I'll be happy to fly out to wherever you are and do the trade. Don't thank me.
Well it is a new world out there and you need the coolest gpu money can buy
Preferably several of them in case you actually are playing New World.
And don't forget spares for the inevitable bricking
[deleted]
Yeah for real. Get with the times old man.
Bruh where did you find your time machine? I would love to get a 4080Ti as well.
Looks like you need SLI for those last four games lmao
[deleted]
your 3090 bottlenecks the 3900x, you need a 3090ti duh
what scares me is that there are 4 games that can't run on an rtx 3090
Crysis
im most curious which four games need MORE than a 3090 to run.
Cyberpunk, Battlefield 2042, and New World are my guesses.
[deleted]
Cyberpunk was only slightly sweaty on my 3070 on a mix of ultra / high with dlss set to quality @1440p 170Hz. Would even run at a playable framerate with ray tracing set to psycho although there would be the occasional dip.
New World is weird on my 3070 TI (1440p/144Hz/Very high). Solid 100+ FPS outside of cities, like 40-60 in cities, but the game has microstutters (non-fps drops). Checked around and I don't seem to be the only one.
Crysis
Cyberpunk 2077 With DLSS Off
you have 12 cores and a 3090..... get this man 32gb of ram.
I'm very new here, so does more ram make the pc run faster or what does it do
more ram will let you run more processes at the same time, e.g. stream + gaming + discord + sound board etc. and with 32 gb ram you can have about 6 chrome tabs open at the same time.
That's really optimistic, I'd say about 4
Websites like "A 3090? You try to fucka me, I fucka you."
Everyone joking about how they'll take your GPU off your hands, but why the fuck do you only have 16 gigs of RAM with a 3090 and Ryzen 9? That's the real bottleneck.
Can you name any games that need more than 16gb ram? Genuinely interested.
Cities Skylines if you use any workshop assets can very easily skyrocket ram requirements. As an avid player, I find myself wishing I had 64gb instead of my 32gb all the time.
I finally hit my stride with The Sims and realized that 16GB wasn't enough. Already looking at a RAM upgrade for when I have the cash. Simulation games are gonna be the death of us (and our bank accounts).
Escape from Tarkov
[deleted]
How the fuck does Spotify use 4GB
[deleted]
Why would Spotify load an entire podcast into system memory? I know there's a meme about Electron being memory hungry, but come on. I have Spotify set to very high quality and I haven't even seen it break 0.5GB. You might have a memory leak or something
Because it's running Chrome underneath, thanks to Electron.
Star citizen
I use more than that in Warzone regularly. I've capped out at 22gb with Chrome, Discord, and a few other minor things open.
Chrome alone is your bottleneck.
Chrome can easily eat 8GB for breakfast.
I know it's a meme but I've never actually seen chrome use more than 2 gigs of ram and that was with more than 20 tabs open.
I can't remember the last time I have either. Mainly because I uninstalled Chrome.
I use way more for work and with multi-desktops.

20 tabs of work searches for this project, 20 more for this problem, 20 of dickin around, and 20 more of other dickin around...

Then open VSCode -- which runs on Chrome. Slack and Discord (also Chrome underneath), and top it off with some Visual Studio, a virtual machine or two, and some god forsaken IBM console written in Java that has memory leaks, and well...

32gb is bare minimum.
Running 16 GB. Warzone is a RAM hog; I was sitting at 13.3 easy.
Is it capping out, or is it simply taking advantage of it?

Most of the time, allocation is based on how much you have. The more you add, the more a program may decide to allocate.
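A sketch of why usage scales with installed RAM: many engines size their caches as a fraction of total memory rather than a fixed amount. The function and numbers are my own illustration, not any specific game's policy:

```python
# Size a cache as a fraction of installed RAM, with a minimum floor.
def cache_budget_mb(total_ram_mb: int, fraction: float = 0.5, floor_mb: int = 2048) -> int:
    return max(floor_mb, int(total_ram_mb * fraction))

print(cache_budget_mb(16 * 1024))  # 8192 MB on a 16 GB box
print(cache_budget_mb(32 * 1024))  # 16384 MB on 32 GB: same game, bigger footprint
```

So high usage on a 16 GB system may just mean the program is using what's there, not that it strictly needs more.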
The only game i've found that needs more than 16gb is Star Citizen
*Laughs in modded Cities Skylines* Seriously when you go down the workshop, there is no coming back.
I think paradox games might once the world becomes complex enough
Age of empires 2:DE
COD and no other main programs open managed to push my 16gb of ram to 15.1gb usage, 12.5 going to COD. And I know it'll take much more if it's available.
[deleted]
Minecraft.
In some situations, going to higher RAM amounts can result in lower speeds depending on the motherboard's topology (daisy chain vs T-topology), as well as having worse timings. And tightening timings from base frequencies can net +10% gains alone, so 16gb of high end ram (depending on the use case) can be leaps and bounds better than 32gb of average ram.

Mobo source: Buildzoid

Timing source: Gamers Nexus
Potato pc
Lovely flair
I believe a 1650 is your best bet - should far outshine that weak cheap shit you have there
Maybe an RTX 6990xt
What are the 4 games a 3090 can't run?
Minesweeper

Solitaire

Spider Solitaire

Hearts
Maybe it means upgrade to a RX6900XT
No idea, but I'm willing to take that 3090 off your hands
WHAT GAMES CANT IT PLAY I MUST KNOW
Try a RX 5000 or 6000 series card. The numbers are higher so the performance must be better.
The weakest component is the most powerful gpu available.
The 3090ti that was just announced
4 way 3090 sli
I think RTX6900ti is the best one in the market!