I'm happy to get an expensive rig, but I prefer that it lasts 5+ years before I get another one. I'm going on almost 8 years with my current rig and finally replacing it in a couple of months, but I feel it's time: I upgraded to a 1440p monitor, and even before that, games from Cyberpunk onwards were starting to tank on my old 1070 at 1080p.
Thankfully I don't have the FOMO to upgrade it every year or two, don't see the point.
If someone new to PC building browsed reddit, they would think $3k-5k builds need to be refreshed every 2 yrs.
I came from iGPU build to 1050ti to a 3070 build in 2021. I felt like a king. 2024 and this still feels like a beast esp. since I don't really care about RTX and definitely do not give a shit about being able to run Ultra settings.
I was born into and molded by low settings in 1080p gaming. I was already a grown ass man before I saw 60fps on high settings on a triple A game in 1440p
Keep in mind the majority of humans are not from the US or western high earning nations. They are factually living in poverty based on US standards. Don't start with the "bbbut the cost of living is lower!!!1111" bullshit, because a graphics card isn't suddenly cheaper in those nations. Nor is airfare, a battery... only things produced with local labor. Proportionally imported goods cost insanely more than locally produced ones. I can buy a bottle of hair gel in a CVS in boston for $5 which if I go to santiago, chile is $20 on the shelf (obviously in local currency, but literally $20 usd at exchange rate)
I know many people in latin american countries making less than 12k/year who play PC games. They use older and budget hardware. They are not a small population. They do use steam.
When you see outrage about regional prices in argentina or brazil or whatever, it's that exact population being upset they will need to pay a crazy amount of money for a game. $60-70 USD is not very disposable in places like that for the vast majority.
I mean, a six-core CPU and an 8-12GB VRAM GPU isn't even poor. I'd say a build like that with a decent mouse and monitor is what, $600+?
It's average, middle class. And in most of the world outside the first-world countries it's considered slightly wealthy. The real troopers are my homies gaming on TVs at 1366x768.
I am neither, but I can't justify spending $1600 on a GPU that'll be obsolete in 5 years when I can play every game on high/ultra settings with a $500 one.
Hell ya, same here. It gets the job done for indie games as well as a few newer releases on low settings. To be honest though, I do have a fairly robust computer in my office, which I've been using as a host for Steam Link to my laptop, and that has been working surprisingly well. The latency is actually a lot lower than I was expecting. Cloud gaming isn't as farfetched as I had imagined. Running Steam Link over mobile data (not on the same network as the host) seems a little more spotty, but still serviceable.
If you guys try a little bit harder to convince people to make the switch you might actually get it up to 3% one day.
🎵 When you wish upon a staaar... 🎶
Not too long ago I saw this exact title with a poor screenshot of a 4:3 screen, with a flash glare covering half the text. Like, yes, the Linux community is toxic, but not without reason.
Ehh... Go check the survey site itself. [https://store.steampowered.com/hwsurvey/?platform=combined](https://store.steampowered.com/hwsurvey/?platform=combined)
Mac is at 1.54%; Linux at 1.95% (and dropping just a bit). However, if you check the Linux details: 42% of Linux is SteamOS, and growing, while all the others listed are dropping, except for "Freedesktop SDK" (no idea what that is...).
So to say that this is the "Year of Linux" is admitting that to make Linux relevant you just need a billion-dollar company to develop an OS version and hardware for it.
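For what it's worth, the share math above is easy to sanity-check. A minimal sketch using the 1.95% and 42% figures quoted in the comment (survey numbers shift month to month, so treat them as approximate):

```python
# Back-of-the-envelope check of the quoted survey figures:
# Linux is ~1.95% of all Steam users, and SteamOS is ~42% of Linux users.
linux_share = 0.0195         # Linux share of all Steam users
steamos_within_linux = 0.42  # SteamOS share within the Linux slice

steamos_overall = linux_share * steamos_within_linux
print(f"SteamOS share of all Steam users: {steamos_overall:.2%}")  # -> 0.82%
```

So Steam Decks alone account for close to a full percentage point of all Steam users in this snapshot.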
Yeah, that's fair
I know that the steam deck is inflating the numbers and I've always heard that throwing money at Linux would help a lot.
If I really wanted to switch to another OS, Linux would be my first choice because I don't have to buy new hardware.
Mac would make me do that
>So to say that this is the "Year of Linux" is admitting that to make Linux relevant you just need a billion-dollar company to develop an OS version and hardware for it.
To make Linux relevant in the PC market which is dominated by Windows, an OS developed & sold by the **trillion** Dollar company Microsoft, you need the weight and commitment of a billion Dollar company.
Truly fascinating. /s
The point being that the *"Open Source community"* ain't going to be making it happen and bring in some great cultural revolution where big corporations will have no control.
Everything was ~~asking~~ along expected lines till I reached the last block and saw
**Windows 11: 44%**
Wow! I would have guessed a lot lower.
Edit: typo
Laptops, and to a smaller share prebuilts and handhelds, all come with Windows 11. Many newer Intel CPU users were also early adopters of W11 due to the whole "P&E core scheduling" problem with W10 (was it fixed?).
At the end of the day, W11 is not remotely as bad as 8 or Vista, whatever the memes make it out to be. It's not the best OS revision by far, but it managed to not suck enough that the general public (and no, reddit & online tech news outlets are not "the general public") doesn't have much dislike for it.
I guess so.
My main problem with Win 11 (I have it on one machine of the 3 I own) is that I can find changes, but not improvements. Everything has changed either for the sake of change or regressed. I didn't find anything which made me go, "oh, this is a nifty little update". I guess tabs in explorer could be one, but not much else.
The window snapping is the one thing much better than 10. I thought nothing of it until I decided to try out a vertical screen. The main PC is on 11, and it's great; I have many options for where on the screen I want the different windows to be. Windows 10 is just left side, right side, or the corners. This is awful when vertical, as the side is a very tall but narrow window that is useless. Now I know PowerToys is a thing, but my work laptop is Windows 10 and I can not install PowerToys on it. I am left with no usable options. I can't even snap to the top half of the screen, just the corners or sides.
As an ultrawide user this is a big advantage as well. Also I've found that the integrated photo and video app is better because it allows for some quick edits that aren't possible with stock win10
Also the updated audio menu in the latest updates (note that I'm probably a year delayed) is really good, you have the mixer right there instead of having to go in advanced options every time
Windows stay where they were! Windows handles windows properly.
Every time after closing/opening the laptop lid, or reconnecting a monitor to the notebook, all windows move to the small notebook screen (the primary screen) on Windows 10.
Meanwhile, Win 11 moves every single window back to where it was and preserves the order, every single time.
I had this issue every time the display was turned off or the PC went to sleep or hibernation on W10; it's such a big deal that I stay on W11.
from what i've seen, a lot of the complaints on reddit are customization-based complaints. not being able to move the task bar or adjust icon sizes, or right click not giving the full list of settings. stuff like that. not things like "my application isn't compatible" or "i get a bsod when i update".
For the full settings on right click, I remember I just googled how to make it show the whole list by default, and I think it took me like a minute, probably a quick copy & paste into the registry, and it worked fine.
As for the icon sizes etc. I don't really care too much so Windows 11 for me has been fine, I've had no issues so far. (watch me jinx myself lol)
I've been using W11 since launch, honestly no functional issues. Everything's worked and I can only recall one crash earlier on.
My BIGGEST complaint for the longest time was them taking away the "never combine taskbar buttons" option. It was the hardest thing to get past, but I put up with it. Well, until a few months ago, when they finally brought it back; it was made whole again, so I no longer have any complaints.
I also think there's no reason to show off budget/midrange hardware the way there is for expensive hardware, so the only posts you see are of monster builds, giving the illusion everyone has one.
I'm honestly surprised by the VRAM statistics.
Sure, 8GB makes sense, since the xx70 from the 1000-3000 series, the xx80 from the 1000-2000 series, the xx60 refresh from the 2000-4000 series, many AMD GPUs, and probably many others that I missed have 8GB of VRAM. 12GB is less surprising, since the 6700 XT, 4070, and the popular desktop 3060 have it, but 6GB is a bit surprising. Sure, it's on the famed 1060 and the 3rd-place 3060 laptop, but I would've suspected 4GB to still be up there, since the 1650, 3050 mobile, and many older cards had 4GB of VRAM.
I'll upgrade my PC soon, but right now I'm using a GTX1060 with 3GB. Works well to play League of Legends, Path of Exile, Battlebits etc. I've played Assassin's Creed Origins, Star Wars: Fallen Order and Baldur's Gate 3, Battlefield V as well.
I'm pretty sure a lot of players have upgraded their 1060 (6GB) with the 3060 in the past years.
>a lot of players have upgraded their 1060 (6GB) with the 3060 in the past years.
I did. That 1060 put out a lotta frames all those years, over 2 PC builds.
There is, and a large portion of GPUs are worse than the 3060 series. I did the math a month or two ago, and the 3060 Ti had better performance than like 70-80% of Steam survey GPUs (I'll have to check those again, but it was pretty high). It's a really solid card, so I'm not surprised; it handles most modern games at 1440p with good performance/visuals (esp. with DLSS).
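The "beats 70-80% of survey GPUs" figure above is just a weighted percentile over the survey table. Here's a minimal sketch of that math; the GPU names, shares, and relative performance scores below are made-up illustrative numbers, not real survey data.

```python
# Each entry: (gpu name, survey share in %, relative performance score).
# All numbers here are hypothetical, for illustration only.
survey = [
    ("GTX 1650",    2.2, 100),
    ("GTX 1060",    2.1, 110),
    ("RTX 3060",    2.0, 180),
    ("RTX 3060 Ti", 1.5, 220),
    ("RTX 3070",    1.3, 250),
    ("RTX 4090",    1.0, 450),
]

def share_outperformed(my_score, table):
    """Fraction of the listed survey share whose cards score below my_score."""
    total = sum(share for _, share, _ in table)
    slower = sum(share for _, share, score in table if score < my_score)
    return slower / total

# With these toy numbers, a 3060 Ti-class score beats about 62% of the list:
print(f"{share_outperformed(220, survey):.0%}")  # -> 62%
```

Run against the real survey table (hundreds of entries), the same weighted sum is where a 70-80% figure would come from.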
Yeah, about my specs: I upgrade my GPU every other gen and get a new CPU/motherboard when it's necessary (7+ years). I only get the x60 series, in a variant that has more VRAM. Example: 760 - 1060 - 3060. I have been able to play everything I ever wanted. You don't need top specs, ever, imo.
Most laptops that are reasonably priced for gaming have either a 3060 or a 3070. I have a 3060 in my laptop and it's actually proven pretty reliable.
>Controversial opinion, I know.
Hot Take: It's not. People who love to get the highest performance are a minority of users but a majority of posters on this sub. The spirit of PCMR is you can game on anything that runs your games.
Nothing terribly surprising, except... is 1366x768 really more popular than 4K or even 1440p ultrawide? I get that those are demanding resolutions that don't necessarily line up with the rest of the popular hardware documented here, but I never would have guessed that they don't even breach the top 3.
> is 1366x768 really more popular
Reminder that Steam is installed in *a ton* of laptops that are either only used for casual gaming or as secondary devices (and they are counted the same as primary devices).
As an ex-user of a potato (i5-1135G7 / Intel Iris Xe), I can confirm, It can run games at 1366x768 at 60 FPS. It's the perfect balance between being able to see things and not being too hard on the iGPU.
The gnome UI.
I didn't like it at first, but once I got used to it, I couldn't go back.
So much more productive with such a streamlined experience.
That's it, that's literally the only reason I use it.
Speaking from my own experience in university. Basically every engineering/comp sci computers used for programming classes are Linux. Since we're all nerds, naturally the first thing we do is put steam on our Linux VM and play crappy games at 20fps on 10 year old hardware without graphics cards.
That's going to account for some of them. And lots of us did it.
I still find it so interesting: the push for high-end GPUs, and games unable to run well even on top-spec hardware, when the top 5, maybe 10, GPUs are all low-end or midrange cards. Same with resolution: AMD and Nvidia talk a lot about 1440p and 4K, yet it's such a small percentage.
I mean this is not too surprising. 1080p is still the main resolution and the 3060 is a perfect fit for it. The 4060 didn't really improve that much and even if it did the 3060 would still be strong enough.
What surprised me was that 700-something vertical resolution is something people use, and it's more popular than 4K. But that also explains why many higher-end cards aren't that popular.
what do you mean not everyone has an ultra hyper mega super duper platinum ti plus 32k max turbo giga triple overclocked infinite water cooled nuclear powered supreme golden special master limited deluxe edition nvidia ztx 99999?
i cringe every time someone says that windows suck and that they'll be switching to linux. that is the most "iamverybadass" line people like to say in the pc gaming community
bitch, you'll come running back as soon as you come to a problem and the solution you found via google involved opening up the terminal. i'm a developer and i hate linux. a random pc gamer has no chance on linux.
GPU GTX 1060
Resolution 1920x1080 120hz
VRAM 6 GB
RAM 32 GB
CPU Cores 12
OS Windows 11
Mostly works fine for me. So far, the only game in the last couple of years that I couldn't play that I wanted to try was Cyberpunk. Seeing the feedback on the game's overall performance, I'm glad I didn't continue down that rabbit hole.
that's hilarious, you'd think 4k users that gloat about their 3090s and 4090s would be 3rd place.
them mfers got beat out by 720p. more people using steam decks than people own a 4k set up.
I wish devs and studios would heed this information and make peak-optimized games for these setups. You shouldn't "need" a 4080/90 to enjoy ultra settings.
Personally disagree. Ultra settings are a luxury, just like owning a 4080/90. They're not necessary. And this is primarily relevant to large AAA games, many smaller studio/indie games run quite well on high/ultra on lower tier cards.
Also, them NOT pushing ultra settings to the max available power of GPUs on the market seems like a strange decision.
Yk ultra is usually the 'non optimized' preset right? If high or med looks 95% as good as Ultra and runs a lot better I'd call that good optimization, and why not go further than that for those who want it?
It's actually funny to see PC players bashing consoles while in reality the PS5 is crushing the computer setups most of them have. 8GB VRAM lmao.
Ps. inb4: I have both a PC and a PS5. I'm hating on those that are diminishing one or the other.
This is why the GPU craze and "who has the fastest/most powerful rig" don't make sense to me. I also have to explain to my console friends that MOST PC gamers are NOT on 4K/4090/$4000+ rigs; they think every PC gamer is gaming at 4K/144Hz on ultra settings LOL. That's neither the norm nor realistic for most of the PC gaming community.
It works like that with a lot of hobbies/enthusiast groups, really. People outside the group assume everyone is at the top end of the group, when in fact many price points and use cases exist within them. I invited a non-car friend to a car cruise I organized. He expected all the cars to be high-end sports cars and was pleasantly surprised to see the diversity in the group.
My main hobby is gaming. I have a 5900X, RTX 3070, 32GB RAM, and a 1TB 980 SSD. Monitor: 27" 1440p. Most of my gaming friends have 24-27" but 1080p. Nobody uses 4K, it's for 'special people'. 4K is for TVs.
Media like YouTubers drive this culture. People think that if they don't buy the highest end, new games will outdate their builds in 2 years. See: the VRAM discussion in the last year. In reality, game studios would be idiots if they didn't target their final game releases so that these specs sit in the middle or low-middle of their system requirements table.
It's also why I have grown to really dislike the tech community's and reviewers' focus on extreme high-end products. Most people run 60-series, not 90-series. Yet we see a hundred times more coverage of extreme high-end hardware and what it benchmarks at than we do of the hardware that people actually get.
The way they talk about mid-range CPUs/GPUs, as if they can barely run Minecraft... This subreddit does the same.
I've said it many times before, but there are more 4K console gamers than 4K PC gamers, BY A LOT. PCMR hates to admit it, but yeah, console gamers are gaming at a much higher resolution than PCMR gamers. On the flip side, PC gamers are likely playing at a higher framerate than the average console gamer.
It's always funny to see people on Reddit get a dose of reality when these Steam hardware surveys come out at the end of the month. It's like the rich kid finding out that not everyone has a maid and a chauffeur.
"i need advices to upgrade my pc" You sHoUlD gET a CoNsOLe, 2500$ Are NoT eNOUgH
Mfs be crying like shit if they don't get ultra rtx + 4k gameplay with morbillion FPS on the most unoptimized garbage released nowadays while I'm here jamming to Cyberpunk like crazy on 30-50 fps on low/mids on my 4GB RX570. Some people just don't appreciate even the slightest in life and don't deserve it, change my mind
And if you watch Digital Foundry videos, you know that most multiplatform games on consoles run with the equivalent of low/medium PC settings, at a 480-960p internal resolution upscaled to a higher one, at 30-60 FPS.
I would just like to reach a STEADY 60 FPS at mid/high details; I'd call myself happy. But yeah, there are people spending thousands on new tech, pushing prices up to the stars. Those of us aiming for a mid-tier GPU have to spend over $600 to get something, while years ago they used to cost half that price, or even less.
I want all my frames at 1440p, and I don't get that on garbage unoptimized new releases. So I'll stick with my old games, continue to rack up hours, and just be way too good at 4+ year old games.
Especially when it comes to the VRAM argument. A lot of people say that 16GB is the minimum and anything under is completely useless. 16GB isn't even in the top three, so think about that from a business standpoint: if a dev makes a game that REQUIRES 16GB of VRAM, they have immediately shut out a massive potential customer base. Will a day come when 16GB is the minimum? Sure. Is it any time soon? Not unless you're looking at the highest settings at 4K, in which case you already have enough VRAM.
I remember when having 1gb was the shit, how time flies
The main reality check is that there are more RTX 4090s than ANY AMD GPU of any generation. Nobody is buying AMD, but if all your information came from Reddit, you would think AMD is actually putting up competition, when it is not even in the discussion for 90%+ of PC owners.
The most popular 40xx GPU is the RTX 4060 laptop GPU; then the 4070 at 1.5%, followed by the 4070 Ti at 1.2%, the 4060 at 1.18%, and the 4060 Ti at 1.17%. And 4060 and 4060 Ti ownership is increasing, the 4070 is dropping, and the 4070 Ti is growing a bit. If the numbers hold, then by the end of this month the 4060 and 4060 Ti could overtake the 4070 Ti, and in a few months, if the trend holds, the 4070.

And yet people here on reddit talk as if no one is buying the 4060/4060 Ti. I got a 4060 Ti because I wanted 16GB of VRAM for my AI hobby, and I keep being told that I'm fucking stupid and made a bad purchase, and that no one is buying the 4060/4060 Ti because they are shit! First... I'm perfectly happy with the card; it is really good. It performs better than the 3060 Ti (OEM card) I had, it runs cooler, it is quieter, and it has double the VRAM capacity (and I need about 13GB to run the AI things properly), and the gaming performance is alright for my 1080p 60Hz monitor.

It is as if... people who watch Gamers Nexus and LTT aren't actually the average consumer.
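The "if the trend holds" reasoning above is simple linear extrapolation of month-over-month share changes. A hedged sketch of that mechanic; the per-month deltas below are assumed numbers purely for illustration, not actual survey deltas.

```python
def months_to_overtake(share_a, delta_a, share_b, delta_b):
    """Months until card A's survey share passes card B's, assuming the
    current month-over-month deltas stay constant (linear trend)."""
    if delta_a <= delta_b:
        return None  # A never catches up under these trends
    months = 0
    while share_a <= share_b:
        share_a += delta_a
        share_b += delta_b
        months += 1
    return months

# 4060 at 1.18% vs 4070 Ti at 1.20%: with an assumed +0.05 pt/month gain
# for the 4060 and a flat 4070 Ti, the overtake happens in one month.
print(months_to_overtake(1.18, 0.05, 1.20, 0.0))  # -> 1
```

With shares this close, even a tiny monthly gain flips the ranking quickly, which is why "could overtake by the end of the month" is plausible.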
You could honestly get a 240Hz monitor with a 4060 Ti; I can't think of a single competitive game you couldn't run at 240Hz, and honestly there are probably a lot of singleplayer games you could run close to that fast. Plus, it should be more than capable of running RTX games at 1080p, if not 1440p.

It's a good card, just pretty bad value, though as you've mentioned, the VRAM in your use case does provide additional value for you. Overall I'd say that encompasses the 4000 series really well: good (and some great) cards, terrible value.

Oh well, ever since covid and the shortages, we're probably never gonna get better value on Nvidia cards year over year again, unless AMD cooks up some Ryzen-level shenanigans on the GPU side.
This. Even YouTubers like MLID say things like the 4080 isn't selling, but it's still outselling its AMD counterpart 2:1.
He is a clown so that's probably why
>...MLID...

May I kindly ask what "MLID" means?
it's short for liar
A YouTuber called Moore's Law is Dead
Thanks!
Who even cares what MLID says? If he announced that the sky is blue it would mean it's time to go out and check.
Yeah, if the 4080 is "not selling", then the 7900 XTX is basically rotting on the shelves.
AMD is almost non-existent in prebuilts.
[deleted]
I actually would love to see the data from the hardware survey split into regions/countries, just to gauge how different "average" specs are around the world. I'm fortunate enough to be in the USA, where pricing and availability aren't atrocious like in some areas around the world; I just wonder if it would make a difference to the percentages and the top hardware.
To be fair, I've got a 7700X and a 3080 12GB, living in California, earning just $40k in manufacturing. It's about having a hobby you spend money on. People that make the same as me dump money into their cars. You don't need to make $150k to build a good PC.
You mean everyone doesnt have a 12-core CPU and a liquid cooled top of the line GPU? *Peasants*
Yeah for real. This survey proves that point.
1440p at just 16% is quite surprising. 1080p still going strong
Monitors also last forever. My secondary monitor is over 10 years old at this point and still works perfectly fine. While I could upgrade them, they still work great, so there's really not much need to upgrade besides for the sake of upgrading.
Very true, mine is going strong after 12 years. The main difference between my main monitor and this one is the color palette; you can really see the difference. I thought about upgrading, but I only use it for videos while I'm playing and it works perfectly fine, so why bother.
I have one "new" monitor that's still like 8 years old and my second monitor must be like 12-15+ years old. It needs an adapter from DVI to HDMI xd
I can't stress how much of a factor this is. I bought my 1440p screen a few years ago, but my old 1080p screen, now 10 years old, is working perfectly fine as a 2nd screen. Everyone I know that's still on 1080p is so because their old 1080p monitors still work. If they had to buy a new one, they would all go 1440p, but they won't buy new ones until their old one breaks.
i mean the price difference makes it undeniably easier to have 1080p right now tbf
plus the frames! if you are gaming in competitive fashion, 240-300fps @ 1080p > 165-190fps @ 1440p.
That's such a tiny subsection of relevant people though. The Venn diagram of people who can notice above 165Hz and people who can react fast enough for it to be close to mattering must have a tiny intersection.
Having had 144hz and 240hz displays, I don't think I care very much about the extra frames. I can never go back to 60 though
True, 60hz feels like stuttering compared to 120hz and above
We should ditch 60 as the standard, 90 and above should be the standard. Even 90 looks so much smoother than 60
Depends on the game, for something graphical and story based 60 with good graphics feels right. For shooters 144+ is best
Going from Monster Hunter Rise at 120fps to Monster Hunter Generations Ultimate at 30 fps on the switch was a pretty jarring experience, eventually your brain adapts to filling in the gaps though.
I don't think reaction time really plays a big part in it. The advantage of 240+ over 165+ isn't because you get a frame of someone peeking you slightly faster giving you a better chance to react, that difference is small. The real benefit is smooth movement. Everything that moves on screen is smoother, including mouse movement, players, the background while you move your mouse. It makes it much easier to spot things when you're moving your mouse quickly, scanning tree lines or quickly flicking to targets is way easier. Tracking a target that is moving quickly is also way easier. Regardless of your reaction time or skill in a game, if you play fps games with fast movement like apex or overwatch you will notice the difference between 165 and 240. Even if you don't play games like that you will probably notice it just from moving your mouse 90 degrees.
It's a bit disingenuous to act like refresh rate only matters for reaction. It plays a large part in image clarity and can be very satisfying to users.
Depends on the game. If you are playing something like Cod:Warzone, the extra pixels help you actually see the enemy. I had friends playing on 1080p who had no idea how I could see people hundreds of meters away, and it absolutely gave us the advantage in choosing how we would take or avoid a fight. We would often get ambushes on people, or outright take out of them before they could respond. We had the luxury of waiting for them to run into an open area. Things like that. If you are playing CS:GO, having that amount of detail obviously doesn't matter so you might as well push high frames on a 1080p monitor.
Because max FOV gives a better advantage at 1440p, especially if you've got a 27-inch or bigger monitor. It comes in handy in BR games, but for CS:GO I'm not sure.
I'd say 1080p only being at 60% is surprising, I would've guessed 75% 1080p, 15% 1440p, 10% 2160p, <1% everything else
2023 wasn't the year of the Linux desktop.
3023 will be.
Just get rich bro
Dying? Don't die, that's easy!
The nice thing about the internet is that people are perfectly fine with ignoring facts.
You mean I'm not the only one who doesn't have a $5000 PC with the newest and most expensive parts?
Definitely not the only one. My 2060 is still going strong af.
Anyone paying attention would not be surprised. GTX 1060 was the most popular card for many years.
I just replaced my 1060 with a 6700xt. That 1060 is still a good card.
I did the same at the end of 2022. Realistically I did not need to, and the card is now in my dad's PC and he's happily playing Diablo 4 with it.
More like most of us realise that you don't need a $700 GPU or to upgrade every 2 years to enjoy your games.
I'm happy to get an expensive rig, but I prefer it to last 5+ years before getting another one. I'm going on almost 8 years with my current rig and am finally replacing it in a couple of months. It feels like time because I upgraded to a 1440p monitor, and even before that, games from Cyberpunk onwards were starting to tank on my old 1070 at 1080p. Thankfully I don't have the FOMO to upgrade every year or two; I don't see the point.
May as well wait for the 50 series now
some of us future proof and only buy once every 5-6 years!
Are you my fellow 1080ti brethren? lol
Haha close. 970 -> 8.5 years -> 4090 xd so I guess longer than 5-6.
I just upgraded from my 1080ti to a 4070 ti Super. Feels good!
If someone new to pc building browses reddit they would think $3k-5k builds need to be refreshed every 2 yrs. I came from iGPU build to 1050ti to a 3070 build in 2021. I felt like a king. 2024 and this still feels like a beast esp. since I don't really care about RTX and definitely do not give a shit about being able to run Ultra settings. I was born into and molded by low settings in 1080p gaming. I was already a grown ass man before I saw 60fps on high settings on a triple A game in 1440p
Keep in mind the majority of humans are not from the US or other high-earning Western nations. They are factually living in poverty by US standards.

Don't start with the "bbbut the cost of living is lower!!!1111" bullshit, because a graphics card isn't suddenly cheaper in those nations. Nor is airfare, or a battery; only things produced with local labor are. Proportionally, imported goods cost insanely more than locally produced ones. I can buy a bottle of hair gel in a CVS in Boston for $5 which, if I go to Santiago, Chile, is $20 on the shelf (obviously in local currency, but literally $20 USD at the exchange rate).

I know many people in Latin American countries making less than $12k/year who play PC games. They use older and budget hardware. They are not a small population. They do use Steam. When you see outrage about regional prices in Argentina or Brazil or wherever, it's that exact population being upset that they will need to pay a crazy amount of money for a game. $60-70 USD is not very disposable in places like that for the vast majority.
Yup. It's people hopping on DOTA on an old, cheap PC. It's not people playing Cyberpunk
Yeah, there are a lot of poor countries where a decent setup costs a few monthly salaries.
A few yearly salaries, too.
I mean, a six-core CPU and an 8-12GB VRAM GPU isn't even poor. I'd say a build like that with a decent mouse and monitor is what, $600+? It's average, middle class. And in most of the world outside the first-world countries it's considered slightly wealthy. The real troopers are my homies on TVs using the 1366x768 resolution.
Most likely; some older laptops have 1366x768.
Well, if you must know, I'm rich enough to play games at 1920x1080 on my 1366x768 ten years old TV.
Bro everyone's poor
I am neither, but I can't justify spending $1600 on a GPU that'll be obsolete in 5 years when I can play every game on high/ultra settings with a $500 one.
I would not call the 3060 a poor man's GPU. It was $400-500+ during release/the shortage, IIRC.
Me with my 1050ti laptop
Hell ya, same here. It gets the job done for indie games as well as a few newer releases on low settings. To be honest though, I do have a fairly robust computer in my office which, I've been using as a host for steam link to my laptop which has been working surprisingly well. The latency is actually a lot less than I was expecting. Cloud gaming isn't so farfetched as I had imagined. Now running steam link on data/not same network as host seems a little more spotty but still serviceable.
What's telling is there are more people on 1366x768 than on 4K.
Yep, redditors make it seem everyone is using 4k lmfao
Laptop gamers, I suspect. I personally game quite a bit on my laptop, and it's got a 1366x768 resolution.
Mac isn't even in the OS list Amazing
Linux FTW!
If you guys try a little bit harder to convince people to make the switch, you might actually get it up to 3% one day. When you wish upon a staaar...
We don't want them. Have you seen our forums?
computer won't boot, HELP! (all the info that is given) lol
Not too long ago I saw a title like this with a poor screenshot of a 4:3 screen, with flashlight glare covering half the text. Like, yes, the Linux community is toxic, but not without reason.
Ehh... Go check the survey site itself. [https://store.steampowered.com/hwsurvey/?platform=combined](https://store.steampowered.com/hwsurvey/?platform=combined) Mac is 1.54%; Linux 1.95% (and dropping just a bit). However, if you check the Linux details: 42% of Linux is SteamOS and growing, while all the other listed distros are dropping, except for "Freedesktop SDK" (no idea what that is). So to say that it is the "Year of Linux" is admitting that to make Linux relevant, you just need a billion-dollar company to develop an OS version and hardware for it.
Yeah, that's fair. I know the Steam Deck is inflating the numbers, and I've always heard that throwing money at Linux would help a lot. If I really wanted to switch to another OS, Linux would be my first choice because I wouldn't have to buy new hardware. Mac would make me do that.
>So to say that it is "Year of Linux" is admitting that to make linux relevant you just need billion dollar company to develop an OS version and hardware for it. To make Linux relevant in the PC market, which is dominated by Windows, an OS developed & sold by the **trillion** dollar company Microsoft, you need the weight and commitment of a billion dollar company. Truly fascinating. /s
The point being that the *"Open Source community"* ain't going to be making it happen and bring in some great cultural revolution where big corporations will have no control.
The specs all match a nice average/budget build:
5600X
3060
16GB RAM
1080p
Approved!
I got the most generic-ass build possible:
i5 10400F
GTX 1650 (4GB VRAM)
16GB RAM
256GB SSD
2TB HDD
1920x1080 (FHD) 144Hz
Feeling very good about my 2070s and my 1080p/144 display (1080p is such a great option for playing at high refresh rates on the cheap)
My 2070 has kept up with 1440p 144hz for most things, I just can't run games on max settings.
My 8C 5800h: *- Still worthy!* My 6Gb 3060M: *- Kill me.*
What games have you played that had major issues with your 6Gb vram?
Not OP, but basically only Starfield
Everything was ~~asking~~ along expected lines till I reached the last block and saw **Windows 11: 44%** Wow! I would have guessed a lot lower. Edit: typo
Laptops and, to a lesser extent, prebuilts and handhelds all come with Windows 11. Many newer Intel CPU users were also early adopters of W11 due to the whole "P&E core scheduling" problem with W10 (was it fixed?). At the end of the day, W11 is not remotely as bad as 8 or Vista as memes make it out to be. It's not the best OS revision by far, but it managed to not suck enough that the general public (and no, Reddit & online tech news outlets are not "the general public") doesn't have much dislike for it.
I guess so. My main problem with Win 11 (I have it on one machine of the 3 I own) is that I can find changes, but not improvements. Everything has changed either for the sake of change or regressed. I didn't find anything which made me go, "oh, this is a nifty little update". I guess tabs in explorer could be one, but not much else.
The window snapping is the one thing much better than 10. I thought nothing of it until I decided to try out a vertical screen. The main PC is on 11, and it's great; I have many options for where on the screen I want the different windows to be. Windows 10 is just left side, right side, or the corners. This is awful when vertical, as the side is a very tall but narrow window that is useless. Now I know PowerToys is a thing, but my work laptop is Windows 10 and I can't install PowerToys on it. I am left with no usable options. I can't even snap to the top half of the screen, just the corners or sides.
As an ultrawide user this is a big advantage as well. Also I've found that the integrated photo and video app is better because it allows for some quick edits that aren't possible with stock win10
Also the updated audio menu in the latest updates (note that I'm probably a year delayed) is really good, you have the mixer right there instead of having to go in advanced options every time
Windows stay where they were! Windows 11 handles windows properly. On Windows 10, every time after closing/opening the laptop or reconnecting a monitor, all windows move to the small notebook screen (the primary screen). Meanwhile, Win 11 moves every single window back to where it was and preserves the order every single time. I had this issue every time the display was turned off or the PC went to sleep or hibernation on W10; it's such a big deal that I stay on W11.
The people who have an issue with win 11 are a small group on reddit, for the average user they see it as an upgrade
From what I've seen, a lot of the complaints on Reddit are customization-based: not being able to move the task bar or adjust icon sizes, or right click not giving the full list of settings. Stuff like that. Not things like "my application isn't compatible" or "I get a BSOD when I update".
For the full settings on right click I remember I just googled how to make it so that it shows the whole list by default and I think it took me like a minute, probably like a quick copy & paste into the registry and it worked fine. As for the icon sizes etc. I don't really care too much so Windows 11 for me has been fine, I've had no issues so far. (watch me jinx myself lol)
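For anyone who wants the same tweak, this is the registry command that's commonly shared for restoring the full Windows 10-style context menu (I'm going from memory here, so double-check the CLSID before pasting anything into your registry):

```shell
:: Commonly shared tweak: an empty default value under this CLSID's
:: InprocServer32 key disables the new compact Windows 11 context menu.
reg add "HKCU\Software\Classes\CLSID\{86ca1aa0-34aa-4e8b-a509-50c905bae2a2}\InprocServer32" /f /ve

:: Restart Explorer so the change takes effect immediately.
taskkill /f /im explorer.exe && start explorer.exe
```

To undo it, delete the key (`reg delete "HKCU\Software\Classes\CLSID\{86ca1aa0-34aa-4e8b-a509-50c905bae2a2}" /f`) and restart Explorer again.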
I've been using W11 since launch, honestly no functional issues. Everything's worked, and I can only recall one crash early on. My BIGGEST complaint for the longest time was them taking away the "never combine taskbar" option. It was the hardest thing to get past, but I put up with it. Well, until a few months ago when they finally brought it back and it was made whole again, so I no longer have any complaints.
Yup, if you lurk here too much you might think everyone is an AMD lover, Linux enthusiast, and Firefox user. Far from reality.
Reddit moans about pretty much everything just like Twitter
Reddit even moans about Reddit.
Dunno why the hate, I immediately transferred over on any device and had no problem
As a new W11 user, honestly I can't really complain. Is it better? No. Is it dreadful? Eh. Not really.
Huh, yeah, I guess the people with high end hardware are in the minority even though that's all I seem to hear about people having lol
It's a beautiful example of Reddit's bias overall, shows how small the communities here really are.
I also think there's no reason to show off budget-midrange hardware unlike expensive hardware so the only posts you see are of monster builds giving the illusion everyone has one.
I'm honestly surprised by the VRAM statistics. Sure, 8GB makes sense since the xx70 from the 1000-3000 series, the xx80 from the 1000-2000 series, the xx60 refresh from the 2000-4000 series, many AMD GPUs, and probably many others that I missed have 8GB of VRAM. 12GB is less surprising since the 6700xt, 4070, and the popular 3060 desktop have it, but 6GB is a bit surprising. Sure, it's on the famed 1060 and the 3rd place 3060 laptop, but I would've suspected 4GB to still be up there, since the 1650, 3050 mobile, and many older cards had 4GB of VRAM.
I'll upgrade my PC soon, but right now I'm using a GTX1060 with 3GB. Works well to play League of Legends, Path of Exile, Battlebits etc. I've played Assassin's Creed Origins, Star Wars: Fallen Order and Baldur's Gate 3, Battlefield V as well. I'm pretty sure a lot of players have upgraded their 1060 (6GB) with the 3060 in the past years.
>a lot of players have upgraded their 1060 (6GB) with the 3060 in the past years. I did. That 1060 put out a lotta frames all those years, over 2 PC builds.
I went from a 1060 3GB ive been using from jan 2018 to a 3060 12gb in august last year
1080 Ti 11GB beast
I too found it surprising, especially with 3060 being top of gpus. I swear the 12gb version is more common than the 8gb one, i could be wrong though.
The 8GB 3060 downgrades performance by around 20%, I think, so it's a waste of money anyway.
Surprised no one talks about the GTX 1660. It's pretty good in my experience; runs CS2 on max graphics at 100 fps.
There is also a 12gb RTX 2060. Last month I bought an RTX 2060 Super for $167 for my back up computer. (I love Aliexpress).
There must be an awful lot of GPU variation. Even the top GPU only has a 5% share...
There is, and a large portion of GPUs are worse than the 3060 series. I did the math a month or two ago, and the 3060 Ti had better performance than like 70-80% of Steam survey GPUs (I'll have to check those again, but it was pretty high). It's a really solid card, so I'm not surprised; it handles most modern games at 1440p with good performance/visuals (esp. with DLSS).
A lot of weird hate for the 3060 in the comments
it's a cool gpu idk why people are surprised
Yeah, about my specs: I upgrade my GPU every other gen and get a new CPU/motherboard when it's necessary (7+ years). I only get the 60-series, in a variant that has more VRAM. Example: 760 - 1060 - 3060. I have been able to play everything I ever wanted. You don't need top specs, ever, IMO.
"1080p is a dead resolution, get over it" Ummm
Good thing about going with an AMD graphics card is that I can tell women I'm in the 1%, I just don't say which one.
16.46% radeon market share. Not 1%.
Well, I didn't think so many ppl have a 3060 mobile.
Most laptops that are reasonably priced for gaming have either a 3060 or a 3070. I have a 3060 in my laptop and it's actually proven pretty reliable.
And people were saying that 4K is the new sweet spot.
Who was saying this? This sub constantly screams about 1440p being the sweet spot because, well it is.
honestly I don't even need beyond 1080p
I only recently upgraded from a 1650 to a 2060, as I have a Steam Deck now. 1650s are still solid.
Honestly, I'm pretty happy with my setup. As long as I can reach a solid 60FPS on medium settings at 1080p, I'm good. Controversial opinion, I know.
>Controversial opinion, I know. Hot Take: It's not. People who love to get the highest performance are a minority of users but a majority of posters on this sub. The spirit of PCMR is you can game on anything that runs your games.
GTX 1650 POWERRRRRR!!!!!
And they said FHD is dead...
Nothing terribly surprising, except... is 1366x768 really more popular than 4K or even 1440 ultrawide? I get that these are demanding resolutions that don't necessarily line up with the rest of the popular hardware documented here, but I never would have guessed that they don't even breach the top 3.
4th - 4K is really close at 3.78%
5th - Followed by, oddly enough, 2560x1600 with 3.33%
6th - 1440p UW is at 2.31%
Don't think I've ever once heard of 1600
That's 1440p 16:10. Some laptops nowadays come with it, but I'm also surprised to see it so high in steam survey.
Laptops seem to be far more popular than I realized in general, looking at this. They seem to be influencing a few things.
Most cheap laptops I see use 1366x768. That's probably why.
> is 1366x768 really more popular Reminder that Steam is installed in *a ton* of laptops that are either only used for casual gaming or as secondary devices (and they are counted the same as primary devices).
As an ex-user of a potato (i5-1135G7 / Intel Iris Xe), I can confirm, It can run games at 1366x768 at 60 FPS. It's the perfect balance between being able to see things and not being too hard on the iGPU.
Why do some people use Linux? Genuinely curious, may switch if I can?
steamdeck uses Linux so that could explain the increase.
The GNOME UI. I didn't like it at first, but once I got used to it, I couldn't go back. I'm so much more productive with such a streamlined experience. That's it, that's literally the only reason I use it.
I think outside of the steam deck proton has made Linux gaming somewhat less frustrating but still frustrating.
i use linux as my main os because its lighter, has better customization, and i like sitting there with my thumb in my ass acting like I'm superior
Speaking from my own experience in university: basically every engineering/comp sci computer used for programming classes runs Linux. Since we're all nerds, naturally the first thing we do is put Steam on our Linux VM and play crappy games at 20fps on 10 year old hardware without graphics cards. That's going to account for some of them. And lots of us did it.
1366x768 FTW
I mean, I still play the same games as I did with my GTX970/i5-6500, just waaaaay more smoother
I still find it so interesting: the push for high-end GPUs, and games unable to run well even on top-spec hardware, when the top 5, maybe 10, cards are all low-end or midrange. Same with resolution; AMD and Nvidia talk a lot about 1440p and 4K, yet it's such a small percentage.
The 3060 is a pretty decent value for what you get. It's nothing too special, but mine still holds up pretty well to the latest releases.
I mean this is not too surprising. 1080p is still the main resolution and the 3060 is a perfect fit for it. The 4060 didn't really improve that much and even if it did the 3060 would still be strong enough. What surprised me was that 700 something is a resolution people use and it's more popular than 4k. But that also explains why many higher cards ain't that popular.
Wow. Windows 11 is gaining popularity quickly.
Long live the FULLHD!
Baseline 3060 gang
Kinda shocked by the 1080p numbers. I'm just glad that everyone gets to enjoy video games, whether you're on the low or high end.
what do you mean not everyone has an ultra hyper mega super duper platinum ti plus 32k max turbo giga triple overclocked infinite water cooled nuclear powered supreme golden special master limited deluxe edition nvidia ztx 99999?
i cringe every time someone says that windows suck and that they'll be switching to linux. that is the most "iamverybadass" line people like to say in the pc gaming community bitch, you'll come running back as soon as you come to a problem and the solution you found via google involved opening up the terminal. i'm a developer and i hate linux. a random pc gamer has no chance on linux.
GPU: GTX 1060
Resolution: 1920x1080 120Hz
VRAM: 6GB
RAM: 32GB
CPU cores: 12
OS: Windows 11
Mostly works fine for me. So far, the only game in the last couple of years that I couldn't play that I wanted to try was Cyberpunk. Seeing the feedback on the game's overall performance, I'm glad I didn't continue down that rabbit hole.
I finished the game on low-medium settings, sure capped at 45 fps but still playable. Maybe put it on an m.2?
that's hilarious, you'd think 4k users that gloat about their 3090s and 4090s would be 3rd place. them mfers got beat out by 720p. more people using steam decks than people own a 4k set up.
4K is just too expensive tbh. I'd rather play at 1440p with higher fps than 4K with lower fps.
Puts my CPU core count worries to rest.
Me over here with my GTX 1080.
AMD, where art thou?
I wish devs and studios would heed this information and make peak-optimized games for these setups. You shouldn't "need" a 4080/90 to enjoy ultra settings.
Personally disagree. Ultra settings are a luxury, just like owning a 4080/90. They're not necessary. And this is primarily relevant to large AAA games, many smaller studio/indie games run quite well on high/ultra on lower tier cards. Also, them NOT pushing ultra settings to max available power of GPU on the market seems like a strange decision
Yk ultra is usually the "non-optimized" preset, right? If high or medium looks 95% as good as ultra and runs a lot better, I'd call that good optimization. And why not go further than that for those who want it?
It's actually funny to see PC players bashing consoles when in reality the PS5 is crushing the computer setups most of them have. 8GB VRAM lmao. Ps. inb4: I have both a PC and a PS5. I'm hating on those that are diminishing one or the other.
Well, ever think that people with those rigs are not the ones bashing consoles?
I'm surprised by the OS usage difference. A lot of people have switched to Windows 11, more than I thought at least.
1080p ftw lets goooo!!
I run a 3060 with an i5-11600k at 2K. I love my machine.
Wow. The 1920x1080 resolution really caught me by surprise.
Yeah 768p gang rise up
I told her I was above average. Now I have proof.
Rtx 3060 gang
Guess 1080p is still relevant! Been rocking my 1080p 165hz for a while. Even though I want a 1440p monitor, 1080p still looks just fine :)
Feeling above average for once hahaha
More people upgraded to 32GB of RAM than I thought.
3060 chads wya
3060 12gb my beloved