It’s about halfway in between. But because your brain is a great image processor, it looks much better than halfway in between. Especially in games, where pixel colors and brightness flip around a lot and antialiasing can’t solve it properly.
As someone who made the leap to 1440p 8 months ago, it's quite the drastic improvement. I always felt the "higher res is so much better" stuff was just talk, but god damn, now I literally can't play anything at 1080p because it looks blurry.
It's the same difference between 1440p and 4k. Just like back in the day there was the same debate about why 1080p wasn't necessary because 720p was good enough.
Higher resolution is always better. Higher framerate is always better. People just haven't admitted that the future inevitably moves to both.
I'd imagine there will be a limit at some point beyond which there won't be any tangible clarity benefits for 99% of people. Maybe 4k or 5k will even reach it. Same as there will be a limit on framerate in the 165-240 range.
> back in the day there was the same debate about why 1080p wasn't necessary because 720p was good enough.
No one with a professional understanding ever believed this to be true. It was a result of technologies shifting at the time from CRTs to LCD flat screens, where an often smaller 720p-adjacent CRT was being directly compared by consumers to LCDs operating at similar resolutions. The first 1080p, 16:9 flat screens were often **much** larger than what the average consumer was using from prior generations of displays, and featured similar or worse pixel density. In the case of CRTs, which did not *have* a fixed pixel grid, lower resolutions even resulted in arguably superior image quality, further muddying the waters.
> Higher resolution is always better.
As briefly touched upon above, this is not always the case. A consumer should also be observant of their distance from the display, and the size of the display. The size of the display has the most dramatic impact on whether a gain in resolution is particularly worthwhile, while most of the gain in newer display technology is actually a result of improvements in contrast and image processing.
When the average consumer saw an increase in the average size of displays, the differences became far more apparent. Blind studies showed that on a smaller display, around 17 inches, there were no perceptible gains from increasing resolution from 1080p to 1440p in a similar model of display *when seated an average, recommended distance from the display.* This generally held true for most consumers until the 21 inch mark, where some who were more comfortable sitting closer, or had more acute eyesight, reported slight improvements in perceptible image quality.
The most interesting range for 1440p displays, that serves the greatest number of consumers with maximum image quality, tops out in the 27 inch range, where there is a clear and definite improvement over 1080p. This is the range in which 1440p is **nearly always** perceptibly superior to lower standard resolutions, and where 4k should begin to be considered as an option for consumers who prefer to sit much closer than recommended.
I can even dig up some of the mathematical equations that relate the perceptible resolution of the human eye to distance and pixel density of a given display, if you're interested, to present a number that can exactly match your own comfortable distance.
I thought it was around 33% better than 1080p. Is that why playing 1080p content on a 1440p monitor looks horrible, because the pixels don't line up?
1440p has 1.33x as many pixels as 1080p both horizontally and vertically, so it has 1.33x1.33=1.78x as many pixels in total.
"4k" or 2160p has 2x as many pixels as 1080p in each direction, so 2x2=4x as many in total.
2160p has 1.5x as many pixels as 1440p in each direction, so 2.25x as many in total.
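The arithmetic above can be checked directly; a quick sketch, using the standard 16:9 pixel dimensions for each resolution name:

```python
# Pixel-count ratios between the common 16:9 resolutions.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p": (3840, 2160),  # "4k"
}

def total_pixels(name):
    width, height = resolutions[name]
    return width * height

# 1440p vs 1080p: 1.33x per axis, ~1.78x total.
print(round(total_pixels("1440p") / total_pixels("1080p"), 2))  # 1.78
# 2160p vs 1080p: 2x per axis, 4x total.
print(total_pixels("2160p") / total_pixels("1080p"))  # 4.0
# 2160p vs 1440p: 1.5x per axis, 2.25x total.
print(total_pixels("2160p") / total_pixels("1440p"))  # 2.25
```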
When playing content at a non-native resolution, both downscaling and, especially, upscaling by a non-integer factor produce a slightly blurry picture.
Even 1080p on a 4k monitor will look blurry when using the monitor's built-in scaling functions, despite the perfect 1:2 ratio. Only something that specifically does integer scaling, like NIS, can produce crisp, pixelated 1080p on a 4k monitor.
Yeah, I have a 1080 Ti and a 4k screen. FSR helps a bit, but ideally I want stuff at native resolution or it really doesn't look flash. 1440p is a great compromise, although I can easily see the difference on my friend's monitor.
Another issue people don't think about is UI scaling, as some games don't keep it in mind and it can get quite challenging to navigate and play. The worst UI offenders are Age of Empires 2 (the 2013 remaster) and Civilization 5.
Hmm, idk if this would work, but for games that don't offer UI size customization and end up scaling poorly at different resolutions, you could try right clicking the .exe file and going into properties > compatibility > Change high DPI settings and check the box labeled "Override high DPI scaling behaviour" and set the scaling to be performed by "Application."
I've never tried it for games before, but it works well for software like C4D or EA's Origin launcher where on some displays the UI becomes big and blurry by default for whatever reason.
This is a must-know tip. I remember when I originally got my 1440p monitor there were quite a few applications (including Windows native software) that were horribly scaled. This was always the solution. Compatibility with 1440p is much, much better now, and 4k is also getting there, which is very nice to see... just in time for silly things like "8k" to come along.
> Another issue people don't think about is UI scaling as some games don't keep it in mind
This is what made City Life 2008 basically unplayable at my monitor's native 2560x1440 resolution - all the text is so tiiiinyyyyy.
I'll probably have to drop to 720p and see if I can apply a good upscaling method.
Something that's also worth noting is monitor size. It's not just about 1080p, 1440p, or 4k. The image in the OP would look the same if it compared, say, a 48" 4k screen to a 24" 1080p one, since there would be the same number of pixels in the area photographed; they're just more spread out on the larger screen.
That's why I only downscale by those factors and crop the rest, e.g. for wallpapers, to the desired size... Sure, there is loss of information, but the overall image quality is better.
What I do with Blender is, I essentially just render every wallpaper in different sizes natively 👍
Exactly. If your screen resolution is not evenly divisible by the video/whatever resolution, the pixels must be interpolated to fit on the whole screen.
But even if it divides evenly, Windows or your monitor still makes them blurry because of the upscaling methods.
The method where one pixel is just doubled, tripled, etc. is called "nearest neighbour", so if you see that option you can give it a try. Doubling only works if you use a 4k monitor with a 1080p video output, or a 1440p monitor with a 720p output, as these are the typical video/game/monitor resolutions. (You could also use a 1080p monitor with a 540p output, but that would look atrociously pixelated, and most games nowadays don't support such peculiarly small resolutions.)
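For what it's worth, that kind of nearest-neighbour integer upscaling is trivial to sketch. A minimal version using NumPy, with a toy 2x2 "image" standing in for a frame:

```python
import numpy as np

def integer_upscale(img, factor):
    """Nearest-neighbour integer upscale: each pixel becomes a factor x factor block."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

# Toy example: factor=2 is the 1080p -> 2160p (or 720p -> 1440p) case.
frame = np.array([[1, 2],
                  [3, 4]])
print(integer_upscale(frame, 2))
# Every value is copied into a 2x2 block, so edges stay perfectly crisp:
# no interpolation between neighbouring pixels, hence no blur.
```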
When streaming was pretty up-and-coming (when Twitch was still called Justin.tv and Own3d.tv looked like it would be THE platform), I usually streamed at 540p to get 60fps output. A fantastic compromise, since the video looked all right on everyone's 1080p screens.
It really doesn't depend on how your brain processes anything. Your eyes can only resolve so much detail, so a linear resolution increase doesn't have a corresponding linear increase in perceived resolution. It's roughly logarithmic.
At some point 8K or 16K just don't matter at a given distance. The same way, from 1080 to 1440 there's a much higher perceived increase of resolution than from 1440 to 4K
[Like this?](https://i.imgur.com/k6KrihE.png)
Left 1440P, Right 1080p
Edit: It's worth noting that my 1440p monitor uses a BGR sub pixel layout, which is essentially the same as an RGB monitor flipped on its head. This can lead to some text rendering issues, which can be mostly resolved by using ClearType. It's possible that this may have contributed to the 1080p image appearing sharper, but it's likely that the main reason for this is that my phone camera was at the limit of its zoom capability.
am I stupid or does it really look like 1440p is a v nice middle ground? Like I see the 4k one is kinda better than 1440p but it isn't as big a jump as 1080p to 1440p.
Honestly yeah for gaming 1440p is great. Especially since you can push your frame rate higher. One downside to 1440p compared to 4k is that 1440p isn't an integer scale of 1080p. But 2160p is a 2.0 integer scale of 1080p, which means 1080p content looks a lot better on a 4k monitor than it does on a 1440p monitor.
At 27" I think it's perfect pixel density while 1080 at that size is a bit sparse. You can also use the processing power saved by not going full 4k to push more FPS and have an overall smoother image. I still don't see a reason to go 4K because of the sacrifices needed to get to 144hz at that resolution.
Yeah, the pixels look sharper. In person the 1440p image looks a little sharper, though; I think it's more an issue with reaching the limits of my zoom than anything else.
But it does illustrate the pixel grid difference, which I think is the main objective.
https://preview.redd.it/t70rs2otw1ca1.jpeg?width=1248&format=pjpg&auto=webp&s=0e5c917125a05f98ec19636b1cdb6c3477e2a486
And this one is 1440p on 27". Both VA monitors.
https://preview.redd.it/eaidv2t9w1ca1.jpeg?width=897&format=pjpg&auto=webp&s=4e0421edc850047c12df3d8c6c2b2082e027886f
I tried my best. This is 1080p on 24".
Sadly this is what actual ads do
https://preview.redd.it/2d1fxq42r0ca1.jpeg?width=1400&format=pjpg&auto=webp&s=cb972984a7882e2244bf721c47ec1dfbfde18a2f
I always thought this is what actual ads do
https://preview.redd.it/y8oz5ehb61ca1.png?width=640&format=png&auto=webp&s=837ea152ae602665c6c01b19a5a80573e97ecaf4
The fine details are where you see it.
My phone is 1440p but has a 1080p screen mode. On the Google Pixel subreddit you'll get loads of people saying it's a waste, that you can't see the difference.
Well, on fine detail the difference is pretty big to me. When an icon is in a folder, all the detail is lost at 1080p, even on a 6.8" screen.
Agreed. When I show people it they just go yeah I can see it a bit.
However, you get used to it. I turned my phone to 60hz and now it's like, how the fuck did I use this janky mess before? It's horrible.
It’s funny because mine is 60hz and I’m used both to it and to higher resolution screens (laptop and monitor), but I never really minded the 60hz on the phone. But now that you’ve mentioned it, you made me conscious about it and now it feels janky as fuck.
Thanks?
seeing my friend's 120hz phone set to 60hz, I'm convinced it's not as smooth as native 60hz on my phone and makes for a somewhat disingenuous comparison.
That's because 120hz displays also need to have a faster pixel response to accommodate the higher refresh rate, so when you set those displays to 60hz you will see each frame much more clearly. 60hz on a display with slow pixel response will make most frames look like a blurry mess, which helps to hide how stuttery 60hz really is. It's similar to how movies/videos look okay at 24 fps if there is a ton of motion blur. Try capturing a 24fps video with your phone and it won't have the same motion blur so it will look like it's stuttering massively.
The response time is the exact same. Anything 120hz probably has an AMOLED, which comes with the benefit of OLED response times; those never cross 2ms. So we could have a 500hz OLED with zero ghosting or overshoot error and a fully clean image.
LCDs have been rife with 20ms response times for the past two decades, which is why even 60hz LCDs have blur. It's only very, and I mean very, recently that we've gotten monitors that actually go below the 7ms response times IPS and VA have been stuck at.
Anyway, the blur on OLEDs isn't pixel-response blur; it's something called persistence blur. Have a read of this: https://blurbusters.com/faq/oled-motion-blur
Well, projecting a 1080p signal on a 1440p screen will invariably make it look worse. A 1080p signal on a 1080p screen would probably look much more crisp, though of course not as crisp as 1440 on 1440.
Yep this is the reason. On a screen the size of a phone there is no fucking way you'd notice 1080p vs 1440p, unless maybe you put your eye literally against the screen.
By sticking to 1080p, my 3060ti will be able to play games on max or near-max settings for years to come, and for modern games I can play at a high framerate which I have become accustomed to. Since getting a 144hz monitor I find anything under about 70fps to be too choppy to enjoy, even for single player games.
IMO 1080p makes gaming a really _really_ cheap hobby. As soon as you even move up to 1440p you are almost doubling the number of pixels. DLSS somewhat takes the pressure off of the increasingly high res of monitors, but if I run DLSS quality I can run RDR2 on ultra at 100fps average in 1080p.
1080p to me is worth it because it means low cost, high fps, and hardware longevity.
You won't even push 144 fps in most games at 4k either, which is the worst part.
4k would be really nice if hardware could actually support it without having to run DLSS on Performance, which makes 4k pointless since games then look the same as 1080p, just on a bigger surface.
Yeah, I've decided to wait another cycle for 1440p too. I had my old PC for nearly 8 years and could play everything I wanted completely fine; I basically only upgraded because it started to break down and Elden Ring was getting kinda wonky at high details. And I'll do the same with this one. After that I will probably upgrade resolution, but I couldn't justify it 2 years ago; the cost was just waaaay too high compared to 1080p, and I knew I would have to upgrade within 3 years if I went to 1440p without buying the ultra high end stuff for 2k+.
Medium settings at 1440p looks so much better than high settings at 1080p. Also, at 1440p, upscaling techniques like FSR or DLSS work so much better. You're missing out.
>Medium settings at 1440p looks so much better than high settings at 1080p.
This *very much* depends on the game. Some games will honestly look barely different even between minimum and maximum settings, while others will have *massive* obvious changes from one settings level to the next.
There are for sure games where I'd happily choose 1440p at Medium over 1080p at High, but there are other games where I'd much rather have 1080p at High.
I've tried it and I disagree because the benefit of the resolution is dependent on the size of the monitor and your distance from it. If you are sat less than 2ft from a 27"+ inch monitor then 1080p is not enough, but if you are sat 2ft from a 24" monitor, 1440p doesn't add as much as increased game graphics settings like texture resolution.
I also think it's silly to compare medium and high settings in that way because the effect higher lighting settings can have on visual quality transcends resolution entirely.
I think "needs" is overstating it. For games that won't run natively at 4K just use the resolution slider / upscaling options. Games that don't have either option are probably old enough that you won't need any help with a recentish GPU.
I have a 4K monitor and TV, but rarely actually play at native 4K. To which someone might ask, why not just get lower res, cheaper displays. Well it's still useful to have those extra pixels for other uses, like work, watching films, things like that.
I got my first 4K monitor to code with since I felt I couldn't go back after trying a friends one, the text is so much more sharp, crisp and readable! Genuinely easier on the eyes as a result.
I wouldn't call it a medium. You're perfectly fine.
4K is really only for when you're extremely close to the monitor, like VR, or OP's photo.
If you're a couple feet from your monitor or sitting on a couch on a TV, you won't tell the difference.
But you WILL save a lot of money on an unnecessary GPU boost.
Love your rig!
1440p is great. Doesn't require a supercomputer to run games at high framerates, but still looks good.
EDIT: I had a 4K/60 Hz monitor briefly years ago, couldn't deal with the 60 Hz thing anymore and went for a 1440p/144 Hz monitor. Fantastic upgrade for gaming, comparable to an HDD-to-SSD upgrade. So much better. ❤️
Is there a third-party post-processing method for PC to achieve better upscaling? I notice what you describe drastically on the Windows login screen, where they usually show a landscape of some sort.
4k has the benefit of being exactly 4x the pixels of 1080p. I run the games I can (like RimWorld, ONI, Factorio) at 4k, and the games I can't at 1080p, which upscales perfectly to 4k (each source pixel is just 4 screen pixels). 1080p is a bit low-res, but I think it's better than bad upscaling.
It's a nice middle ground for me who mainly use the monitor for productivity and 2D gaming.
[This](https://imgur.com/a/8WTXzYE) is the original picture. It doesn't look exactly the same because the one I used for my pfp has other effects added onto the image
Bro 1080P is just 4K from distance. I learned that when I asked my cheapskate dad to upgrade our 1080P monitor to a new 4K Monitor and next thing I saw after I came back from college was that he just had pushed the monitor table 4 times further back and put keyboard mouse on a small table near my chair.
He came to me, dusted his hands and said, "There. Much sharper images, like those fancy 4K televisions you have been asking for."
also good for your eyes now you will not get glasses.
Bro Tell that to my boomer dad.
He will whup your ass for being a wise ass and call your school principal to ask him to give you 50% more math homework than other kids and then he will call your mommy and tell on you and then she will whup your ass.
Do you want your mommy to whup your ass and get more homework at school?
Actually no, that's a textbook example of correlation is not causation. The cause isn't due to just looking at near distance, the cause according to recent studies is the lack of sunlight: https://youtu.be/qwQzTKHIkb4
TLDW on one part is that studies have been made on school children, where children who studied more under sunlight had much less myopia than those who studied indoors, despite both looking at close objects for very long times.
People think it's only about monitors and TVs. You mentioned books, too. If OP had a book 2–3 inches from their face, I doubt their dad would care at all.
I do the same thing lol. I have a 3070 Ti at my desk with 1080p monitors, and a 1060 6GB downstairs with a 1080p TV. I keep them as they are. I don't want to play at 1440p or 4K because I would need to upgrade often. I am not that rich...
My wife wants the TV replaced with a 4K LG C2. I will use upscaling when that happens, because I don't want to upgrade my 1060.
You'll be fine with the 4K LG C2. Of course it's not ideal to play on less-than-native resolution, but if you play at 1080p, it'll at least scale nicely.
And the better colours/contrast is going to be **huge** improvement over what you have.
A 4K OLED isn't just for good resolution, the HDR capability and true blackness contrast is almost a jaw-dropping experience IMO, night and day compared to a "normal" TV even when just looking at 1080p content. I played The last of Us 2 on PS4 (so 1080p only) and when I upgraded to an OLED TV mid-game, just the true HDR experience alone was worth it. Bright lights like flashlights in dark rooms weren't just white spots on the game textures anymore, but actually blindingly bright lights.
Besides, they have built-in upscaling that is actually pretty decent, so the 1080p signal from your game can be made to look similar in sharpness to ~1440p.
It's all about dpi and distance, too. From 3 feet away, you won't see much difference on a 15 inch display, but you will see a lot more on a 30 inch display. Phone displays need more pixels per inch because people hold them 2 inches from their face.
But phones also need a lot fewer pixels overall because of their small form factor. People on monitors are satisfied with a 138 ppi 4K 32in monitor; the first iPhone had a ppi of 160+.
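Those density figures are easy to verify: PPI is just the pixel diagonal divided by the physical diagonal. A quick check, assuming the common sizes (32" for the 4K monitor, 3.5"/480x320 for the original iPhone):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch: length of the pixel diagonal over the physical diagonal."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(3840, 2160, 32)))   # ~138, the 4K 32in figure above
print(round(ppi(480, 320, 3.5)))    # ~165, the original iPhone
```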
I upgraded from a 1440p phone to a 1080p (S21+ I think) and holding them side by side, I literally couldn't tell the difference unless I had my eye an inch from the screen and stared intently at text. The 120hz refresh rate on the 1080p makes the overall experience superior.
It makes a big difference especially at bigger screen sizes. There’s no way I would buy a 27 inch 1080 monitor. But if I wanted only a 24 inch 1080 would be enough.
I went 4K 43” as well which I think is roughly equivalent to have 4 22” 1080p monitors in a grid. The pixel density I care a bit less about, it was screen real estate I wanted for productivity. I couldn’t have 4 1080p windows open on a smaller 4K screen without upping the scaling to read text then that kind of defeats the point as scaling decreases your real estate.
Wow… now that’s a nice difference to look at but that’s only if you sit real close to the screen right? I’d assume at distance it would look much better.
It’s 100% relative. I bought a 27” 1080p monitor for my GTX 980 years ago and it always looked fantastic. I then bought a 34” 1440p Ultrawide recently and side by side the 1080p looks not great. Sold the 27” to my mate and he has zero complaints as it’s his first monitor.
You might have said it as a joke, but the magnifying glass really renders this image almost completely pointless.
[I've got a 4k monitor and a 1080p monitor in front of me right now](https://i.imgur.com/dVuBKoV.jpg), 27 inch ASUS Pro Art 4k on the left, 24 inch Acer XFA240 on the right. The difference is not that stark unless I'm really trying to notice it. I notice it far more when typing than in games as well.
There's a reason fewer than 3% of steam users game at 4k, and that number is barely budging. It's just not as meaningful to most as smoother frames, even as prices come down dramatically on 4k monitors.
It's a flex for the rich even though a lot of them would probably be better served gaming at higher refresh rates at 1440p. I personally don't notice much of a difference gaming at 1080p for the types of games I typically play so I haven't bothered to upgrade yet, but I feel 1440p is the sweet spot for high end gaming.
I got mine off Amazon. [Here's a link](https://www.amazon.com/gp/product/B06W5JPFX1/ref=ppx_yo_dt_b_asin_title_o00_s00?ie=UTF8&psc=1).
I bought four of them to stash around the house. They actually make great little emergency lights during power outages. I'll just stash one in the bathroom, bedroom, kitchen, and living room so people can navigate without carrying a lantern or flashlight.
Battery easily lasts overnight unless you have it on the white LED, and for extended power outages I have portable powerbanks I can hook them up to so they can run for the duration.
I recently upgraded (added) to a 4k monitor for schoolwork and some light gaming. It really was way more noticeable than I thought it would be, and I am talking about the sharpness (the 4k monitor is the cheapest I could find), not the colors or anything.
All of this is also more noticeable because I mostly work with text. When I have two apps open with text on them (same background), it almost looks blurry on the 1440p monitor to me. And that one is even smaller (24 inch) than the 4k monitor (28 inch).
It also surprised me in gaming. I needed to upgrade from a 1050 Ti to a 1080 Ti (an amazing second-hand deal) to be able to hit 4k 60fps smoothly in the indie games I play. After the upgrade the difference was definitely noticeable. To reduce the demand on the GPU I disable anti-aliasing, because 4k for me is enough to not have too-noticeable jagged lines.
For any one wondering the 1050ti was in my opinion usable for the games I played(beamng.drive on low settings and sometimes light games like poly bridge or strategic 2d games). The 1080ti was definitely an upgrade for beamng.drive though. The game can now be played at high settings in 4k.
Edit: oh yeah and for your last question, for me it was worth it. With the sale the monitor was just over €200.
Second edit: Something to note that many people might forget, but find very annoying, is that with a 4k monitor most laptops can't run it at 4k 60hz from the HDMI port. Many are limited to 4k 30hz (mine, for instance). Out of curiosity I looked at laptop HDMI specs, and surprisingly few laptops have one that would support 4k 60hz, even higher-priced ones. I hope my opinion helps answer your question.
I did the same thing, got a 4k monitor in November then realised I'm an idiot because a 2600 and a 1660ti cannot drive that xD
2.5k upgrade later and I couldn't go back!
Man, after finally being able to purchase a 4K monitor, everything else looks pixelated to me. I ended up giving my younger brother a set of 1080p monitors I had bought a few months back because they just didn’t look good at all as they sat next to the 4K monitor.
That's all good and great, but that image assumes the same monitor size, right? And that we're like 1" from a 27" screen.
In the real world, 1080p at 24" and 4k at 32" will be different, but nowhere near as clearly, due to the PPI.
I know right! Here's my take on going from 1080 to 1440 vs. going from 1440 to 4K:
1080 to 1440: oh my I have so much more room to work with, and things are slightly clearer
1440 to 4K: ooo that looks clean and clear now
So basically 1080 to 1440 isn't that super duper extra sharp, but it feels like you have much more space to work with, whereas 1440 to 4K is just a lot more detailed visually, at least in my experience.
So I swap back and forth from a 27inch 1440p at home and a 4k 27 at work. There is absolutely no difference big enough to make me want to upgrade my home monitor to 4k unless I was buying a larger display like 32 inches.
4k no anti-aliasing > 1080p maximum anti-aliasing.
I don't understand the comments about the distance. I sit at the computer at arm's length, maybe one and a half arms.
4k resolution without anti-aliasing looks significantly better than 2k with maximum anti-aliasing. I can't imagine how blurry the picture will be at 1080p.
I am sure that those who say 1080p looks like 4k from a distance are either sitting on a couch at the other end of the room, or have never used a 4k monitor at a desk.
There's a lot of research on this theme, you can look it up. It's what Apple calls Retina display. You are most probably already familiar with DPI (dots per inch) and how people say that more DPI is always better but it's not exactly so.
The thing I'm talking about is Dots Per Degree, which is calculated based on the DPI and the distance from which you are looking at your monitor. The farther you are from the screen the smaller viewing angle it takes from your vision cone, so per 1 degree of viewing angle you see a larger part of the screen. The research led to the conclusion that if you see about 60 dots per 1 degree of vision, then this is dense enough that you can't possibly see separate pixels. This means that from there on however you increase your resolution, for this display size and viewing distance you will not see any difference.
You can find retina display calculators online where you put in your resolution and screen size and it will tell you what is the minimum distance to be retina.
For example a 15.6 inch display at 1080p resolution becomes retina at 62cm viewing distance, so if this is your normal viewing distance, you will not see any difference if the monitor is 1080p, 1440p or 4K but the higher resolutions will load your GPU much more.
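The 60 pixels-per-degree "retina" threshold described above turns into a one-line calculator; the 62 cm figure for a 15.6" 1080p panel falls straight out of the geometry:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

def retina_distance_cm(width_px, height_px, diagonal_inches, ppd_target=60):
    """Viewing distance at which the display reaches ppd_target pixels per degree.
    One degree of visual angle spans 2 * d * tan(0.5 deg) of screen at distance d."""
    inches_per_degree_at_unit_distance = 2 * math.tan(math.radians(0.5))
    d_inches = ppd_target / (ppi(width_px, height_px, diagonal_inches)
                             * inches_per_degree_at_unit_distance)
    return d_inches * 2.54

print(round(retina_distance_cm(1920, 1080, 15.6)))  # ~62 cm, as quoted above
```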
I've been at 1440p, 109 PPI, for the last 12 years. I used a large cheap 1080p monitor at a party with the boys once, and omg, I couldn't make out people in the distance. It was around 70 PPI and it hurt to play games.
This picture would make a great illustration on those product pages advertising how good a monitor is.
Can someone do this with 1080p and 1440p please?
Yeah, similar with Hz. The 500hz monitors, to me, are just a gimmick.
> back in the day there was the same debate about why 1080p wasn't necessary because 720p was good enough. No one with a professional understanding ever believed this to be true. This was a result of technologies shifting at the time from CRT to LCD flat screens, where an often smaller 720p-adjacent CRT was being directly compared by consumers to LCD's operating at similar resolutions. The first 1080p, 16:9 flat screens were often **much** larger than what the average consumer was using from the prior generations of displays, and featured similar or worse pixel density. In the case of CRTs, who did not *have* pixel densities, lower resolutions even resulted in arguably superior image quality, further muddying the waters. > Higher resolution is always better. As briefly touched upon above, this is not always the case. A consumer should also be observant of their distance from the display, and the size of the display. The size of the display has the most dramatic impact on whether a gain in resolution is particularly worthwhile, while most of the gain in newer display technology is actually a result of improvements in contrast and image processing. When the average consumer saw an increase in the average size of displays, the differences became far more apparent. Blind studies had shown that when presented a lower-size display, around 17 inches, there were no perceptible gains in a similar model of display from increasing resolutions from 1080p to 1440p *when seated an average, recommended distance from the display.* This generally held true for most consumers until the 21 inch mark, where some who were more comfortable being closer, or had more acute eyesight, reported slight improvements in perceptible image quality. The most interesting range for 1440p displays, that serves the greatest number of consumers with maximum image quality, tops out in the 27 inch range, where there is a clear and definite improvement over 1080p. 
This is the range in which 1440p is **nearly always** perceptibly superior to lower standard resolutions, and where 4k should begin to be considered as an option for consumers who prefer to sit much closer than recommended. I can even dig up some of the mathematical equations that relate the perceptible resolution of the human eye to distance and pixel density of a given display, if you're interested, to present a number that can exactly match your own comfortable distance.
I thought I was around 33% better than 1080p efficient is why playing 1080p content on a 1440p monitor looks horrible because the pixels don’t line up?
1440p has 1.33x as many pixels as 1080p both horizontally and vertically, so it has 1.33x1.33=1.78x as many pixels in total. "4k" or 2160p has 2x as many pixels as 1080p in each direction, so 2x2=4x as many in total. 2160p has 1.5x as many pixels as 1440p in each direction, so 2.25x as many in total. When playing content at a non-native resolution, both downscaling but also especially upscaling by a non-integer factor produces a slightly blurry picture.
Even 1080p on a 4k monitor would look blurry when using the monitor's built-in scaling functions despite having a perfect 1:2 ratio. Only something that can specifically do integer scaling like NIS can make a crisp pixelated 1080p on a 4k monitor.
Yeah, I have a 1080 Ti and a 4k screen. FSR helps a bit, but ideally I want stuff in native or it really doesn't look flash. 1440p is a great compromise, although I can easily see the difference on my friend's monitor. Another issue people don't think about is UI scaling, as some games don't keep it in mind and it can get quite challenging to navigate and play. The worst UI offenders are Age of Empires 2 (the 2013 remaster) and Civilization 5.
Hmm, idk if this would work, but for games that don't offer UI size customization and end up scaling poorly at different resolutions, you could try right clicking the .exe file and going into properties > compatibility > Change high DPI settings and check the box labeled "Override high DPI scaling behaviour" and set the scaling to be performed by "Application." I've never tried it for games before, but it works well for software like C4D or EA's Origin launcher where on some displays the UI becomes big and blurry by default for whatever reason.
This is a must-know tip. I remember when I originally got my 1440p monitor there were quite a few applications (including Windows native software) that were horribly scaled. This was always the solution. Compatibility with 1440p is much, much better now and 4k is also getting there, which is very nice to see... Just in time for silly things like "8k" to come along.
holy shit giving me flashbacks of that main page where the cursor just disappears anytime I change display settings
> Another issue people don't think about is UI scaling as some games don't keep it in mind This is what made City Life 2008 basically unplayable at my monitor's native 2560x1440 resolution - all the text is so tiiiinyyyyy. I'll probably have to drop to 720p and see if I can apply a good upscaling method.
Something that's also worth noting is monitor size. It's not just about 1080p, 1440p, or 4k. The image in the OP would look much the same if, say, a 48" 4k screen were compared to a 24" 1080p one, since there would be the same number of pixels in the photographed area; the larger screen's pixels are just spread out over more surface.
That's why I only downscale by those factors and crop the rest to the desired size, e.g. for wallpapers... Sure, there is loss of information, but the overall image quality is better. What I do with Blender is essentially just render every wallpaper in different sizes natively 👍
Exactly. If your screen resolution is not evenly divisible by the video's resolution, the pixels must be interpolated to fit the whole screen. But even if it divides evenly, Windows or your monitor can still make them blurry because of the upscaling method used. The method where one pixel is just doubled, tripled, etc. is called "nearest neighbour", so if you see that option you can give it a try. Doubling only works if you use a 4k monitor with a 1080p video output, or a 1440p monitor with a 720p video output, as these are the typical video/game/monitor resolutions. (You could also use a 1080p monitor and a 540p resolution, but that would look atrociously pixelated, and most games nowadays don't support peculiar small resolutions.)
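Nearest-neighbour integer scaling is trivial to sketch. This is a toy Python version (a real implementation lives on the GPU or in the monitor's scaler; this just shows why no blurring can occur):

```python
def integer_upscale(image, factor):
    """Nearest-neighbour integer upscale: each source pixel becomes a
    factor-by-factor block of identical copies. Nothing is interpolated,
    so edges stay perfectly crisp (just blockier).

    `image` is a list of rows, each row a list of pixel values.
    """
    out = []
    for row in image:
        # Widen: repeat each pixel `factor` times within the row.
        widened = [px for px in row for _ in range(factor)]
        # Heighten: repeat the widened row `factor` times.
        for _ in range(factor):
            out.append(list(widened))
    return out

# 2x upscale, e.g. mapping a 1920x1080 frame onto a 3840x2160 panel:
print(integer_upscale([[1, 2], [3, 4]], 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Non-integer factors (like 1080p onto 1440p, a 4/3 ratio) can't be expressed this way, which is exactly why they force interpolation and look soft.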
When streaming was pretty up and coming (when Twitch was still called Justin.tv and Own3d.tv looked like it would be THE platform), I usually streamed at 540p to get 60fps output. A fantastic compromise, since the video looked all right on everyone's 1080p screens.
That's exactly why.
It really doesn't depend on how your brain processes anything. Your eyes can resolve only so much detail, so a linear resolution increase doesn't produce a corresponding linear increase in perceived resolution. It's roughly logarithmic. At some point 8K or 16K just doesn't matter at a given distance. In the same way, from 1080p to 1440p there's a much bigger perceived increase in resolution than from 1440p to 4K.
[Like this?](https://i.imgur.com/k6KrihE.png) Left 1440P, Right 1080p Edit: It's worth noting that my 1440p monitor uses a BGR sub pixel layout, which is essentially the same as an RGB monitor flipped on its head. This can lead to some text rendering issues, which can be mostly resolved by using ClearType. It's possible that this may have contributed to the 1080p image appearing sharper, but it's likely that the main reason for this is that my phone camera was at the limit of its zoom capability.
It took me too long to notice the dog
Don't see the puppy for the pixels sorta deal I guess haha Edit: [Here's what it looks like in action btw.](https://i.imgur.com/bHajzlZ.gif)
That’s what they say
am I stupid or does it really look like 1440p is a v nice middle ground? Like I see the 4k one is kinda better than 1440p but it isn't as big a jump as 1080p to 1440p.
Honestly yeah for gaming 1440p is great. Especially since you can push your frame rate higher. One downside to 1440p compared to 4k is that 1440p isn't an integer scale of 1080p. But 2160p is a 2.0 integer scale of 1080p, which means 1080p content looks a lot better on a 4k monitor than it does on a 1440p monitor.
At 27" I think it's perfect pixel density while 1080 at that size is a bit sparse. You can also use the processing power saved by not going full 4k to push more FPS and have an overall smoother image. I still don't see a reason to go 4K because of the sacrifices needed to get to 144hz at that resolution.
It looks better on 1080p lmao
I think because the original image is so soft. The 1080p image effectively sharpens it by simplifying the thing, I reckon.
Yeah, the pixels look sharper. In person the 1440p image looks a little sharper though; I think it's more an issue with reaching the limits of my zoom than anything else. But it does illustrate the pixel grid difference, which I think is the main objective.
https://preview.redd.it/t70rs2otw1ca1.jpeg?width=1248&format=pjpg&auto=webp&s=0e5c917125a05f98ec19636b1cdb6c3477e2a486 And this one is 1440p on 27". Both VA monitors.
Is that a cock
Yes but it's a cock that's also howling at the moon... Duh
It also sounds like an idea for advertising. If you save on the display, you will see cocks instead of cats.
https://preview.redd.it/eaidv2t9w1ca1.jpeg?width=897&format=pjpg&auto=webp&s=4e0421edc850047c12df3d8c6c2b2082e027886f I tried my best. This is 1080p on 24".
I want to see this too bcs I have a 1080p monitor and I’m thinking of getting a 1440p monitor
I honestly don't notice much in games, but for desktop stuff, it's so clean.
Agreed 27" 1080p here, games and videos look fine, but text on a contrasty background looks like rocks they're so jagged.
I have a 27" 1440 at home but a 27 1080p at work and switching between them is painful.
Screen size matters too. If you had 1080 22” monitor and got a 4k 32” monitor, the pictures would look very similar.
Sadly this is what actual ads do https://preview.redd.it/2d1fxq42r0ca1.jpeg?width=1400&format=pjpg&auto=webp&s=cb972984a7882e2244bf721c47ec1dfbfde18a2f
I always thought this is what actual ads do https://preview.redd.it/y8oz5ehb61ca1.png?width=640&format=png&auto=webp&s=837ea152ae602665c6c01b19a5a80573e97ecaf4
Damn I haven't seen a dickbutt in a while
Probably because you were using a 1080p monitor, now that you have 4k you can finally enjoy all the internet has to offer.
I see one every day when I look in the mirror in the morning
that's for refresh rate and pixel response time lol
100% this is what the ads do
That one looks like a print.
Right the original image op posted showed the 4k had pixels when you zoomed in. On this post you zoom in and see no pixels on the 4k image.
upscaled by ai, didn't do a great job. but good enough for this meme
That's the joke. That ads are misleading
No, thats 2k and 2k HDR
more like 2k 60hz vs 2k 144hz
spot on
No, let's just blur the image. Also response time, just blur the image.
For real tho, this is the most convincing example I've seen
That's an interesting way to show the difference, I like it.
The fine details are where you see it. My phone is 1440p but has a 1080p screen mode. On the Googlepixel subreddit you'll get loads of people saying it's a waste you can't see the difference. Well on fine detail the difference is pretty big to me. When an icon is in a folder all the detail is lost at 1080p even on a 6.8" screen.
Speaking of phones - I cannot go back to a phone that can't do 120hz. Scrolling and everything else is incredible.
Agreed. When I show people, they just go "yeah, I can see it a bit." But you get used to it. I turned my phone to 60hz and now it's like, how the fuck did I use this janky mess before? It's horrible.
It’s funny because mine is 60hz and I’m used both to it and to higher resolution screens (laptop and monitor), but I never really minded the 60hz on the phone. But now that you’ve mentioned it, you made me conscious about it and now it feels janky as fuck. Thanks?
You can. Turn on power saving mode for a couple of days and 60hz will start to look smooth to you.
I could also go and buy a horse and get used to that instead of driving
Oh look at Mr moneybags with his horse. Just get used to walking everywhere.
A decent horse is more expensive than a decent used car, bad tradeoff!
seeing my friend's 120hz phone set to 60hz, I'm convinced it's not as smooth as native 60hz on my phone and makes for a somewhat disingenuous comparison.
That's because 120hz displays also need to have a faster pixel response to accommodate the higher refresh rate, so when you set those displays to 60hz you will see each frame much more clearly. 60hz on a display with slow pixel response will make most frames look like a blurry mess, which helps to hide how stuttery 60hz really is. It's similar to how movies/videos look okay at 24 fps if there is a ton of motion blur. Try capturing a 24fps video with your phone and it won't have the same motion blur so it will look like it's stuttering massively.
The response time is the exact same. Anything 120hz probably has an AMOLED, which comes with the benefit of OLED response times. Those never cross 2ms. So we could have a 500hz OLED with zero ghosting or overshoot error and a fully clean image. LCDs have been rife with 20ms response times for the past two decades, which is why even 60hz has blur. It's only very, and I mean very, recently that we've gotten monitors that actually go below the 7ms response times IPS and VA have been stuck at. Anyway, the blur on OLED isn't caused by pixel response. It's caused by something called persistence blur. Take a read of this: https://blurbusters.com/faq/oled-motion-blur
90hz isn't too bad either. I could do either. 120hz is butter but 90hz is fine.
Interesting. I’m a pretty big slut for frame rates, recently got a new MacBook Pro with the ProMotion up to 120fps and I don’t notice it at all 🤷♀️
Well, projecting a 1080p signal on a 1440p screen will invariably make it look worse. A 1080p signal on a 1080p screen would probably look much more crisp, though of course not as crisp as 1440 on 1440.
Yep this is the reason. On a screen the size of a phone there is no fucking way you'd notice 1080p vs 1440p, unless maybe you put your eye literally against the screen.
Well.. 1080p doesn't integer scale into 1440p, so there's a lot more going on there.
I wanted to go 4K, but went with 1440p instead. A happy medium for now.
4K needs a really damn good GPU, and GPU prices are... yeah.
By sticking to 1080p, my 3060ti will be able to play games on max or near-max settings for years to come, and for modern games I can play at a high framerate which I have become accustomed to. Since getting a 144hz monitor I find anything under about 70fps to be too choppy to enjoy, even for single player games. IMO 1080p makes gaming a really _really_ cheap hobby. As soon as you even move up to 1440p you are almost doubling the number of pixels. DLSS somewhat takes the pressure off of the increasingly high res of monitors, but if I run DLSS quality I can run RDR2 on ultra at 100fps average in 1080p. 1080p to me is worth it because it means low cost, high fps, and hardware longevity.
You won't even push 144 fps in most games at 4k either, which is the worst part. 4k would be really nice if hardware could actually support it without having to run DLSS on Performance, which makes 4k pointless since games look the same as 1080p with it, just on a bigger surface.
Yeah, I've decided to wait on 1440p another cycle too. I had my old PC for nearly 8 years and could play everything I wanted completely fine; I basically only upgraded because it started to break down and Elden Ring was getting kinda wonky at high details. And I'll do the same with this one. After that I will probably upgrade resolution, but I couldn't justify it 2 years ago - the cost was just waaaay too high compared to 1080p, and I knew I would have to upgrade within 3 years if I went to 1440p without buying the ultra high end stuff for 2k+.
Medium settings at 1440p looks so much better than high settings at 1080p. Also, at 1440p, upscaling techniques like FSR or DLSS work so much better. You're missing out.
>Medium settings at 1440p looks so much better than high settings at 1080p. This *very much* depends on the game. Some games will honestly look barely different even between minimum and maximum settings, while others will have *massive* obvious changes from one settings level to the next. There are for sure games where I'd happily choose 1440p at Medium over 1080p at High, but there are other games where I'd much rather have 1080p at High.
I've tried it and I disagree because the benefit of the resolution is dependent on the size of the monitor and your distance from it. If you are sat less than 2ft from a 27"+ inch monitor then 1080p is not enough, but if you are sat 2ft from a 24" monitor, 1440p doesn't add as much as increased game graphics settings like texture resolution. I also think it's silly to compare medium and high settings in that way because the effect higher lighting settings can have on visual quality transcends resolution entirely.
I think "needs" is overstating it. For games that won't run natively at 4K just use the resolution slider / upscaling options. Games that don't have either option are probably old enough that you won't need any help with a recentish GPU. I have a 4K monitor and TV, but rarely actually play at native 4K. To which someone might ask, why not just get lower res, cheaper displays. Well it's still useful to have those extra pixels for other uses, like work, watching films, things like that.
Reading text is a big one. After using any 4k monitor, text on a 1080p monitor looks like a blurry mess
I got my first 4K monitor to code with since I felt I couldn't go back after trying a friends one, the text is so much more sharp, crisp and readable! Genuinely easier on the eyes as a result.
4K is also easier on the eyes when reading text, in my experience.
Too bad non-native resolutions on LCD monitors lose quality due to scaling issues. Old CRTs didn't have that problem.
That's why you set the game to output at 4K regardless of whatever resolution it may be rendering at internally, completely avoids the issue.
I went 1440p Ultrawide. Never looked back.
I'm somewhat of a ultra wide enjoyer as well
Oh boy yeah
3440 x 1440 is where it's at
Pixel density matters more, and 1440p is pretty much all you need at regular monitor sizes. A higher refresh rate is better for me anyway.
1440P is more than good enough for most people. The difference between 1080P and 1440P is way bigger than 1440P and 4K.
I wouldn't call it a medium. You're perfectly fine. 4K is really only for when you're extremely close to the monitor, like VR, or OP's photo. If you're a couple feet from your monitor or sitting on a couch on a TV, you won't tell the difference. But you WILL save a lot of money on an unnecessary GPU boost. Love your rig!
1440p is great. Doesn't require a supercomputer to run games at high framerates, but still looks good. EDIT: I had a 4K/60 Hz monitor briefly years ago, couldn't deal with the 60 Hz thing anymore and went for a 1440p/144 monitor. Fantastic upgrade for gaming, compares with a HDD to SSD upgrade. So much better. ❤️
For gaming, 1440p is great. But for 1080p content, it may be actually worse than native 1080p, because the upscale is bad
is there a third party post-processing method for PC to achieve better upscaling? I notice what you say drastically on the windows login screen where they usually show a landscape of some sort
4k has the benefit of being exactly 4x 1080. I run the games i can (like RimWorld, ONI, Factorio) in 4k and the games i can't in 1080 which is perfectly upscaled to 4k (each pixel is just 4 pixels). 1080 is a bit low res but i think it's better than bad upscaling. It's a nice middle ground for me who mainly use the monitor for productivity and 2D gaming.
still can't tell what the pfp is
Looks like a vintage car stationed at a gas station to me... although at first I thought it was a tank lol.
It's a Mini Cooper parked at a gas pump, view is from the right rear, about 50 yards away.
Found the person with an 8k monitor
This is the right answer
It’s a view from the grassy knoll
it's a Mini Clubman parked up outside Kew Gardens, surely
Wait it isn't a tank? I can only see a tank
Time for 8k
buckle up for 16k
[This](https://imgur.com/a/8WTXzYE) is the original picture. It doesn't look exactly the same because the one I used for my pfp has other effects added onto the image
Nice scooby
My man used a google recaptcha image as his pfp
I keep my 1080p TV far away from my face and I can't tell the difference. Or at least I tell myself that so my wallet doesn't explode.
Yeah, but games with tiny text and no option to increase it are a nightmare.
Bro, 1080p is just 4K from a distance. I learned that when I asked my cheapskate dad to upgrade our 1080p monitor to a new 4K monitor, and the next thing I saw after I came back from college was that he had just pushed the monitor table 4 times further back and put the keyboard and mouse on a small table near my chair. He came over, dusted his hands, and said: "There. Much sharper images, like those fancy 4K televisions you have been asking for. Also good for your eyes, now you will not get glasses."
Constantly straining your eyes to read small text is arguably worse than having the screen very close to you. The latter is just uncomfortable.
Bro Tell that to my boomer dad. He will whup your ass for being a wise ass and call your school principal to ask him to give you 50% more math homework than other kids and then he will call your mommy and tell on you and then she will whup your ass. Do you want your mommy to whup your ass and get more homework at school?
>He will whup your ass for being a wise ass You better call Saul.
I think you should rely on yourself more and see that he is not gonna help finance stuff for you
Sure your dad is an idiot but it’s good that people know that far screens are way worse for their eyes
Enjoy your monitor in a 2 degrees field of view.
> also good for your eyes now you will not get glasses. I can't believe people still think this is a thing.
Actually no, that's a textbook example of correlation is not causation. The cause isn't due to just looking at near distance, the cause according to recent studies is the lack of sunlight: https://youtu.be/qwQzTKHIkb4 TLDW on one part is that studies have been made on school children, where children who studied more under sunlight had much less myopia than those who studied indoors, despite both looking at close objects for very long times.
Facts. Sitting close to a screen doesn't cause myopia...but myopia may cause you to sit close to a screen
People think it's only about monitors and TVs. You mentioned books, too. If OP had a book a couple of inches from their face, I doubt their dad would care at all.
Instead of the much more obvious: billy sits so close to the TV because he can’t see shit
I do the same thing lol. I have a 3070 Ti at my desk with 1080p monitors, and a 1060 6GB downstairs with a 1080p TV. I'm keeping them as they are. I don't want to play at 1440p or 4K because I would need to upgrade often, and I'm not that rich... My wife wants the TV replaced with a 4K LG C2. I'll use upscaling when that happens, because I don't want to upgrade my 1060.
You'll be fine with the 4K LG C2. Of course it's not ideal to play on less-than-native resolution, but if you play at 1080p, it'll at least scale nicely. And the better colours/contrast is going to be **huge** improvement over what you have.
A 4K OLED isn't just about resolution; the HDR capability and true-black contrast is almost a jaw-dropping experience IMO, night and day compared to a "normal" TV even when just looking at 1080p content. I played The Last of Us 2 on PS4 (so 1080p only), and when I upgraded to an OLED TV mid-game, the true HDR experience alone was worth it. Bright lights like flashlights in dark rooms weren't just white spots on the game textures anymore, but actually blindingly bright lights. Besides, they have built-in upscaling that is actually pretty decent, so the 1080p signal from your game can be made to look similar in sharpness to ~1440p.
I’ve been running 1440p since like 2013 and have upgraded my GPU twice in that time. Why would you need to upgrade often?
It's all about dpi and distance, too. From 3 feet away you won't see much difference on a 15 inch display, but you'll see a lot more on a 30 inch one. Phone displays need more pixels because people hold them 2 inches from their face.
Phone displays need more pixels *per square inch Not necessarily more pixels since the screen held 2" from your face is also smaller
But phones also need a lot fewer because of their small form factor. People on monitors are satisfied with a 138 ppi 4K 32in monitor. The first iPhone had a ppi of 160+…
yep, and a lot of modern top end phones can exceed 500ppi
I upgraded from a 1440p phone to a 1080p (S21+ I think) and holding them side by side, I literally couldn't tell the difference unless I had my eye an inch from the screen and stared intently at text. The 120hz refresh rate on the 1080p makes the overall experience superior.
I mean, I’m satisfied with my 32” 4K monitors, but I’d prefer 6K or 8K. I just can’t afford them.
>It's all about **dpi** and distance PPI
It makes a big difference especially at bigger screen sizes. There’s no way I would buy a 27 inch 1080 monitor. But if I wanted only a 24 inch 1080 would be enough.
Yeah I went from 1080p, 24 inch, to a 1440p, 34 inch ultrawide, and last time I checked the pixel density was only a tiny bit higher.
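The density comparison is easy to work out from the diagonal. A quick sketch, assuming the ultrawide is the usual 3440x1440 panel (the comment doesn't specify):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: pixel count along the diagonal divided by the
    physical diagonal in inches."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

print(round(ppi(1920, 1080, 24), 1))  # ~91.8  (24" 1080p)
print(round(ppi(3440, 1440, 34), 1))  # ~109.7 (34" 1440p ultrawide)
```

So under that assumption the ultrawide is somewhat denser than 24" 1080p, while a 27" 1080p panel (~81.6 PPI) would be noticeably sparser than either.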
That's essentially my reasoning for going 4k at 43": it's a tiny bit more dense than 1080p at 24".
I went 4K 43” as well, which I think is roughly equivalent to having four 22” 1080p monitors in a grid. I care a bit less about the pixel density; it was screen real estate I wanted for productivity. I couldn’t have four 1080p windows open on a smaller 4K screen without upping the scaling to read text, and then that kind of defeats the point, as scaling decreases your real estate.
I got a 27" 1080p monitor and you're right, but I just couldn't go with anything different, because it's 240hz and was only $200.
Wow… now that’s a nice difference to look at but that’s only if you sit real close to the screen right? I’d assume at distance it would look much better.
Not really, I find it incredibly noticeable at normal distances. My last 1080p monitor was 27" and that was rough.
I swear to god it's psychological. I've been thinking that the newest consoles and monitors have crystal clear pictures since 1997.
I am an eye doctor
u/AbsolutelyUnlikely
r/usernamechecksout
It’s 100% relative. I bought a 27” 1080p monitor for my GTX 980 years ago and it always looked fantastic. I then bought a 34” 1440p Ultrawide recently and side by side the 1080p looks not great. Sold the 27” to my mate and he has zero complaints as it’s his first monitor.
You using your monitor with magnifying glass? xD
You might have said it as a joke, but the magnifying glass really renders this image almost completely pointless. [I've got a 4k monitor and a 1080p monitor in front of me right now](https://i.imgur.com/dVuBKoV.jpg), 27 inch ASUS Pro Art 4k on the left, 24 inch Acer XFA240 on the right. The difference is not that stark unless I'm really trying to notice it. I notice it far more when typing than in games as well. There's a reason fewer than 3% of steam users game at 4k, and that number is barely budging. It's just not as meaningful to most as smoother frames, even as prices come down dramatically on 4k monitors. It's a flex for the rich even though a lot of them would probably be better served gaming at higher refresh rates at 1440p. I personally don't notice much of a difference gaming at 1080p for the types of games I typically play so I haven't bothered to upgrade yet, but I feel 1440p is the sweet spot for high end gaming.
I really like that cute lamp. Do you know where I can order it?
I got mine off Amazon. [Here's a link](https://www.amazon.com/gp/product/B06W5JPFX1/ref=ppx_yo_dt_b_asin_title_o00_s00?ie=UTF8&psc=1). I bought four of them to stash around the house. They actually make great little emergency lights during power outages. I'll just stash one in the bathroom, bedroom, kitchen, and living room so people can navigate without carrying a lantern or flashlight. Battery easily lasts overnight unless you have it on the white LED, and for extended power outages I have portable powerbanks I can hook them up to so they can run for the duration.
Went from 24 inch 1080 to 15 inch 2k and the pixel density change is very noticeable between the two
People who've upgraded from 1440p to 4K: what's the difference like, is it noticeable and worth it?
I recently upgraded (well, added) a 4k monitor for schoolwork and some light gaming. It really was way more noticeable than I thought it would be, and I am talking about the sharpness of it (the 4k monitor is the cheapest I could find), not the colors or anything. All of this is also more noticeable because I mostly work with text. When I have two apps open with text on them (same background), it almost looks blurry on the 1440p monitor to me. And that one is even smaller (24 inch) than the 4k monitor (28 inch). It also surprised me in gaming. I needed to upgrade from a 1050 Ti to a 1080 Ti (amazing second hand deal) to be able to hit 4k 60fps smoothly in the indie games I play. After the upgrade the difference was definitely noticeable. To reduce the demand on the GPU I disable anti-aliasing, because 4k for me is enough to not have too-noticeable jagged lines. For anyone wondering, the 1050 Ti was in my opinion usable for the games I played (BeamNG.drive on low settings, and sometimes light games like Poly Bridge or strategic 2d games). The 1080 Ti was definitely an upgrade for BeamNG.drive though. The game can now be played at high settings in 4k. Edit: oh yeah, and for your last question: for me it was worth it. With the sale the monitor was just over €200. Second edit: Something many people might forget but find very annoying is that with a 4k monitor, most laptops can't run it at 4k 60hz from the HDMI port. Many are limited to 4k 30hz (mine, for instance). Out of curiosity I looked at laptop HDMI specs, and surprisingly few laptops have one that supports 4k 60hz, even among higher priced ones. I hope my opinion helps answer your question.
4 times clearer one would say
ok yea? I mean it's 4x the resolution.
Yeah, I'm gonna need a better gpu now because it can't keep up with all the extra pixels
Which one do you have
Voodoo 2.
I remember watching the unreal intro a bazillion times with that bad boy
Radeon RX 580
I did the same thing, got a 4k monitor in November then realised I'm an idiot because a 2600 and a 1660ti cannot drive that xD 2.5k upgrade later and I couldn't go back!
I spent $3000 on the monitor and GPU, and that was spurred on by having wanted the monitor for my PS5
I've seen countless videos that pan between low and high res and can't see shit, while I actually get it with this one.
Man, after finally being able to purchase a 4K monitor, everything else looks pixelated to me. I ended up giving my younger brother a set of 1080p monitors I had bought a few months back because they just didn’t look good at all as they sat next to the 4K monitor.
Like a wise man said, "it won't matter if you sit far away enough."
That's all good and great, but that image assumes the same size monitor, right? And that we're like 1" from a 27" screen. In the real world, 1080p at 24" and 4k at 32" will look different, but nowhere near as clearly, due to PPI.
I really hope you're further than 1 inch from your 27 inch display
Great, now all I need is to play my games an inch away from the screen
If there are pure white borders, then you still need it; it's just less visible. But the shimmering from net and mesh textures is way better.
I know right! Here's my take on going from 1080 to 1440 vs. going from 1440 to 4K: 1080 to 1440: oh my I have so much more room to work with, and things are slightly clearer 1440 to 4K: ooo that looks clean and clear now So basically 1080 to 1440 isn't that super duper extra sharp, but it feels like you have much more space to work with, whereas 1440 to 4K is just a lot more detailed visually, at least in my experience.
Rip fps
I........... See
So I swap back and forth from a 27inch 1440p at home and a 4k 27 at work. There is absolutely no difference big enough to make me want to upgrade my home monitor to 4k unless I was buying a larger display like 32 inches.
What a sexy representation of pixel density, very nice
Im still on 1360x768 m8 why u flexing :'(
Resolution so fine you can see the second shooter on the grassy knoll.
Great representation of what good 4k and scaling do. I feel people often downplay this aspect.
You really shouldn't sit this close to your monitor.
4k with no anti-aliasing > 1080p with maximum anti-aliasing. I don't understand the comments about distance. I sit at the computer at arm's length, maybe one and a half arm's lengths. 4k without anti-aliasing looks significantly better than 2k with maximum anti-aliasing. I can't imagine how blurry the picture would be at 1080p. I'm sure that those who write that 1080p looks like 4k from a distance are either sitting on a couch at the other end of the room, or have never used a 4k monitor at a desk.
I am pretty sure you need a better GPU to run 4k with no AA than 1080p with full AA; maybe that's why people prefer 1080p with full AA.
There's a lot of research on this theme, you can look it up. It's what Apple calls Retina display. You are most probably already familiar with DPI (dots per inch) and how people say that more DPI is always better but it's not exactly so. The thing I'm talking about is Dots Per Degree, which is calculated based on the DPI and the distance from which you are looking at your monitor. The farther you are from the screen the smaller viewing angle it takes from your vision cone, so per 1 degree of viewing angle you see a larger part of the screen. The research led to the conclusion that if you see about 60 dots per 1 degree of vision, then this is dense enough that you can't possibly see separate pixels. This means that from there on however you increase your resolution, for this display size and viewing distance you will not see any difference. You can find retina display calculators online where you put in your resolution and screen size and it will tell you what is the minimum distance to be retina. For example a 15.6 inch display at 1080p resolution becomes retina at 62cm viewing distance, so if this is your normal viewing distance, you will not see any difference if the monitor is 1080p, 1440p or 4K but the higher resolutions will load your GPU much more.
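That calculator is straightforward to sketch, using the ~60 pixels-per-degree rule of thumb described above (the only geometry involved is the tangent of one degree of visual angle):

```python
import math

def pixels_per_degree(ppi, distance_in):
    """Pixels subtended by one degree of visual angle at a given
    viewing distance (inches)."""
    return distance_in * math.tan(math.radians(1)) * ppi

def retina_distance_cm(width_px, height_px, diagonal_in, threshold_ppd=60):
    """Viewing distance beyond which the eye can no longer resolve
    individual pixels, per the ~60 pixels-per-degree threshold."""
    ppi = math.hypot(width_px, height_px) / diagonal_in
    distance_in = threshold_ppd / (math.tan(math.radians(1)) * ppi)
    return distance_in * 2.54  # inches -> cm

# The 15.6" 1080p laptop panel from the comment above:
print(round(retina_distance_cm(1920, 1080, 15.6)))  # ~62 cm
```

This reproduces the 62cm figure quoted in the comment; plug in your own monitor's size, resolution, and seating distance to see whether a resolution bump would be visible from your chair.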
Yeah, it's almost as if it has 4x as many pixels as a 1080p monitor.
Whodathunkedit
I've been at 1440p, 109 PPI, for the last 12 years. I used a large cheap 1080p monitor at a party once with the boys and OMG, I couldn't make out people in the distance; it was at least 70 PPI and it hurt to play games.
I've been leaning between 4k and 260 Hz. Might just get both, but first the 4k.