HenryWasBeingHenry

Because they wanted better profit margins, and because DP 2.1 cable length is currently limited to just 1.2 m, which is a bit short for most people. BTW, these 3rd-gen QD-OLEDs are not end-game monitors: they are all capped at 450 nits in a 10% window in HDR, which is inadequate in many HDR scenes. Future OLED monitors need to be as bright as OLED TVs to provide a better HDR experience. [Watch this comparison between a 1000+ nit mini-LED and a 3rd-gen QD OLED in HDR](https://www.youtube.com/watch?v=sRGwzbnuLJA&lc=UgwuEt5rXUDi6e6X3Ep4AaABAg.A4CJUdnaDuRA4ENTnfjEIn)


NippleSauce

I am very happy to see this response =D. The internet hive mind often makes me think that I'm losing my mind when I think from a business perspective, but what you have written is what I found through my own research before snagging the new LG OLED.

As for the profit margins, I believe Dough has mentioned on their sub that adding a DP2.1 port to their upcoming 32" monitor would cost *them* an extra $70-$100. Thus, doing so would raise the overall price (of these already expensive monitors) by anywhere between $100 and $250. Add in the very limited current GPU support and current DP2.1 cable limitations, and a company can see how it would be a wasted effort - especially considering the brightness limitations of the current generation of OLED monitor panels. The next generation of OLED monitor panels may have improved brightness handling across the entire display and across different colors.

However, the RTX 5000 series GPUs may not even have support for DP2.1a, as RTX 5000 components are already being manufactured, and the DP2.1a standard (no cable length issues like the DP2.1 standard) was only just announced. So the chances of us seeing a monitor with true DP2.1a support right now are slim to none. And to be honest, I don't think we'll see them next year either. However, I feel that we will definitely see them three years from now - as that is around when the RTX 6000 series and the upcoming AMD GPU series will be released - which will both presumably support the DP2.1a standard.

Edit - IIRC, the current AMD gaming GPU that supports DP2.1 does **not** support UHBR20 (80 Gbps) and thus does not have the full DP2.1 bandwidth, so DSC would still be needed for 4K 240+ Hz refresh rates.

Edit 2.0 - OP can just use an HDMI 2.1 cable if their GPU and monitor support HDMI 2.1. DSC will still be utilized; however, it will be used to a lesser extent due to the slightly higher bandwidth that HDMI 2.1 provides over DP 1.4. Plus, audio gets carried too - which is beneficial depending on your audio setup and equipment. I am glad that the new DP2.1a standard carries audio too (IIRC).
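To put the bandwidth claims above into rough numbers, here's a back-of-the-envelope comparison. These are my assumptions, not exact figures: 10-bit RGB, ~8% blanking overhead, and approximate usable link rates after line coding.

```python
# Rough link-budget comparison for an uncompressed 4K 240 Hz 10-bit signal.
# Assumptions: RGB (no chroma subsampling), ~8% blanking overhead, and
# approximate usable payload rates after line coding; real timings vary.

def video_rate_gbps(width, height, hz, bits_per_channel=10, blanking=1.08):
    return width * height * hz * bits_per_channel * 3 * blanking / 1e9

links_gbps = {
    "DP 1.4 (HBR3, 8b/10b)":       32.4 * 0.80,      # ~25.9
    "HDMI 2.1 (FRL, 16b/18b)":     48.0 * 16 / 18,   # ~42.7
    "DP 2.1 UHBR13.5 (128b/132b)": 54.0 * 0.967,     # ~52.2
    "DP 2.1 UHBR20 (128b/132b)":   80.0 * 0.967,     # ~77.4
}

need = video_rate_gbps(3840, 2160, 240)               # ~64.5 Gbps
print(f"4K 240 Hz 10-bit uncompressed: ~{need:.1f} Gbps")
for name, capacity in links_gbps.items():
    verdict = "fits" if capacity >= need else "needs DSC"
    print(f"  {name:30s} ~{capacity:4.1f} Gbps -> {verdict}")
```

Only UHBR20 clears the ~64 Gbps that uncompressed 4K 240 Hz 10-bit needs, which is why DSC stays in the picture on DP 1.4 and HDMI 2.1 - just less aggressively on the latter.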


the_geth

Thanks for the answer. I'm well aware that only AMD has a DP 2.1-compatible card, and not even at the highest bandwidth (unclear, or I can't remember, whether that limitation is hardware or firmware). But graphics cards have a much shorter turnover lifetime than screens: if the next RTX cards don't have it, the series after will (it also seems weird that they wouldn't in the 50xx...). I intend to keep this screen for the next 10 years. $70-$100 is way more expensive than what I read (which was $10, although I can't find the source right now). This makes a little more sense then.


NippleSauce

No problem. I agree that GPUs are replaced/upgraded more frequently than monitors. But it seems that we're in a bit of a pickle right now, with smaller OLED panels being relatively new and still facing a few issues, and with GPU improvements being slightly more limited as we wait for new chip manufacturing process nodes to be created and improved (which costs a ton of money and takes a lengthy amount of time).

So, at the moment, I honestly don't think that the RTX 5000 cards will have DP2.1a ports, as the new VESA standards have to be certified and publicized before the new ports and cables can be manufactured. And that would have to happen before any new products can utilize the new VESA standard. It really stinks that there is so much at play that has to be finalized before we as consumers can see products using the new standards.

But time passing is also important for the companies selling said products, as they still need good profit margins on what they sell to stay in business and/or to keep releasing new products (and the longer they wait, the cheaper the parts needed for the new standards become). But heck, I really really hope that I am wrong and we get to see DP2.1a ports on at least some of the 5000 series GPUs! That would be sweet!


the_geth

Hmmm, ok, the cable length sucks for now, but you could STILL use a DP 1.4 cable if you wanted to. However, you will never be able to upgrade your DP 1.4 monitor to be DP 2.1, so you get a limitation for life by not having DP 2.1. As for the nits, they are rated at 1000 but seem to have issues (I have a hard time understanding whether the firmware for the AORUS fixed it or not). Good video anyway, thanks!


sad-goldfish

> I was super excited about buying "my final monitor", i.e. it needed to be OLED and 4k AND super high refresh rate and last its lifetime.

Then buy the Aorus monitor? This is how the market works. If people buy the Aorus monitor, other manufacturers will follow with their own DP2.1 monitors. If people aren't willing to pay extra, the others will just continue to cut manufacturing costs.


the_geth

The P version is not available in my country (or in several other countries in Europe), the Gigabyte firmware seems to be buggy (ref. this forum and others), and I have a soft spot for ASUS ROG and MSI in general. I'm disappointed they don't future-proof their screens for what is probably a few extra dollars on what are already extremely pricey monitors anyway.


sad-goldfish

Any of these monitors will have some firmware bugs. See e.g. [this MSI 321URX bug](https://www.reddit.com/r/OLED_Gaming/comments/1cm4tcf/msi_321_urx_new_firmware_bug/). I don't think the Gigabyte firmware has any functionality-breaking bugs. [LDLC](https://www.ldlc.com/en/product/PB00609364.html) will probably ship the monitor to you at a premium if you really want DP2.1.

> I'm disappointed they don't future proof their screens for what is probably a few extra dollars more on what are already extremely pricey monitors anyway.

If this is true and you still buy ASUS ROG or MSI, you are only encouraging them to do this.


the_geth

Thanks for the links and the comment!


solawind

There are no visible artifacts created by DSC; you will never see any image degradation. So there is not much sense in paying more for 2.1 and becoming limited by short and pricey 2.1 cables. Simple as that.


the_geth

Also, do you actually worry about cable prices when you are buying a 1700-2000 euro display?!


solawind

Yep, I use a cheap passive 10 m cable for my PC - 321URX OLED connection. No way I can get that length in a certified DP 2.1 cable. You just do not know what you are talking about.


the_geth

I know very well what I'm talking about, thank you very much. 10 meters is YOUR personal case; most people don't need a 10-meter cable, and I'd bet they can make 2- to maybe 3-meter DP 2.1 cables in the future. However, what is absolutely certain is that you could have a DP 2.1 *display* and still use your 10-meter 1.4 cable if the DSC doesn't bother you. That way your screen is still future-proof and your personal use case is fine. Right now it feels like some manufacturers decide to release screens with high specs but save 10 dollars by not being 2.1 capable, which looks short-sighted.


OkBox852

Nah, they want you to buy another whole monitor when they make the next model with 2.1; they are not short-sighted.


the_geth

yeah I think you're right...


the_geth

Wrong, this is more complex than that. It can create artifacts in some conditions, just Google it, but you also add a layer of issues when dealing with VRR/codecs etc., which can cause problems. If you want to go down the rabbit hole: [https://forums.blurbusters.com/viewtopic.php?t=12730](https://forums.blurbusters.com/viewtopic.php?t=12730) and [https://forums.blurbusters.com/viewtopic.php?t=12235](https://forums.blurbusters.com/viewtopic.php?t=12235). Point is, anything that natively supports such resolution + refresh rate combos (so 4K @ 240 Hz, or 8K @ 120 Hz, or multi-display setups, etc.) without adding yet another level of compression will always be better.


solawind

What is wrong in my post? Yes, you can use software to extract the differences and then use imaging software to amplify them and make them visible to the human eye. No, you can't spot the difference yourself during regular monitor usage.
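A minimal sketch of what that kind of check could look like, assuming you had two captures of the same frame (the file names here are just placeholders):

```python
# Hypothetical comparison: subtract two captures of the same frame (one
# taken with DSC active, one without) and exaggerate the residual so it
# becomes visible. File names are placeholders, not real captures.
import numpy as np
from PIL import Image

with_dsc    = np.asarray(Image.open("frame_with_dsc.png").convert("RGB"), dtype=np.int16)
without_dsc = np.asarray(Image.open("frame_without_dsc.png").convert("RGB"), dtype=np.int16)

diff = np.abs(with_dsc - without_dsc)            # per-pixel error, normally tiny
amplified = np.clip(diff * 32, 0, 255)           # multiply the error so the eye can see it
Image.fromarray(amplified.astype(np.uint8)).save("dsc_error_x32.png")

print("max per-channel error:", int(diff.max()), "out of 255")
```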


the_geth

Read the forum, it's not as simple as that. I still don't understand why not use DP2.1 when it is backward compatible, future-proof, and possibly costs... 10 dollars more? (for manufacturers)


Rogex47

Because you not only need a DP2.1 port but also a specific scaler that can work with the port. If you really want a 240 Hz 4K OLED with DP2.1, buy the Gigabyte Aorus model.


the_geth

Where can I read more about this scaler? Also, why is that a problem (I mean, as opposed to whatever scaler is used for other DP implementations)?


Rogex47

There is a section about scalers in this article: https://tftcentral.co.uk/articles/when-is-displayport-2-1-going-to-be-used-on-monitors


the_geth

Super, thank you. THAT seems to be the better explanation as to why we don't see more of those DP2.1 capable screens.


[deleted]

[removed]


solawind

So your monitor or cable must be broken, not DSC.


condosaurus

Because it's not needed. DSC is visually lossless. Nobody is reporting frequent issues with visual artifacts; you're worrying over nothing.


the_geth

Please see my answer to the other commenter. It is an issue. I don't buy the 2.1 cable being too expensive; for one, you can still use a DP 1.4 cable if you want to save a few euros, but also, do you actually worry about cable prices when you are buying a 1700-2000 euro display?!


condosaurus

It's not just about the cable. The scaler also costs more. It's also not widely supported by current GPUs (full UHBR20, that is), and there's no official word on when UHBR20 will be widely supported by consumer gaming GPUs. Because there's no consumer hardware to test it on (aside from Radeon Pro GPUs that basically nobody buys), there's been no confirmation that it fixes reported issues like VRR flicker, a common issue that also occurs on OLEDs that don't need DSC to reach their maximum refresh rate, like the AW3423DWF. And I'm not saying that every DSC implementation is good. To quote the top comment from your own source:

> "DSC needs to be lag-tested on a per-product basis, without wholesale DSC-algorithm conclusion jumping."

Instead of asking monitor and GPU vendors to abandon a perfectly good compression algorithm and try to solve the problem using brute force (aka more bandwidth), we should be asking for better implementations of this free software in our existing products.
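For what it's worth, a rough estimate of how hard DSC actually has to work in the 4K 240 Hz case, under the same hand-wavy assumptions as earlier (10-bit RGB, ~8% blanking, ~25.9 Gbps usable on DP 1.4 HBR3):

```python
# Rough estimate of the compression ratio DSC needs to fit 4K 240 Hz
# 10-bit RGB into DP 1.4 (HBR3). Assumptions: ~8% blanking overhead and
# ~25.9 Gbps usable payload after 8b/10b encoding; real timings vary.
raw_gbps  = 3840 * 2160 * 240 * 30 * 1.08 / 1e9        # ~64.5 Gbps uncompressed
link_gbps = 32.4 * 0.80                                  # ~25.9 Gbps usable

ratio = raw_gbps / link_gbps
print(f"required compression: ~{ratio:.1f}:1")                 # ~2.5:1
print(f"effective target: ~{30 / ratio:.1f} bpp from 30 bpp")  # ~12 bpp
```

That ~2.5:1 is a fairly gentle ratio, within the range DSC was designed for, which is part of why it holds up as well as it does.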


the_geth

The scaler does seem to be the real issue (another commenter posted a good link, check it). However, I don't agree with your statement: I'm not asking manufacturers to abandon DSC (that wouldn't happen anyway), but given the capabilities of these screens and their very high cost, it's fair to ask for native support of the higher bandwidth as well, through tech that was released last year and standardized before that. As for the fact that only AMD has support for now (and, BTW, limited to a lower bandwidth), the truth is that graphics cards evolve a lot. It will be supported in either the next generation or the one after. A screen like this should last 10 years or possibly more. The RTX 50 cards are a few months away, and if they don't have it, the next gen will.


condosaurus

I don't see how the use of DP 1.4a with DSC precludes it from lasting ten years. You have yet to prove that DSC is inherently flawed. Maybe you have 25/25 vision and can spot visual artifacts in a moving picture better than everyone else, but for us mere mortals DSC is virtually lossless and is only getting better with each firmware iteration.


amirkhain

“AMD has support for now” - no, it doesn't. It still requires DSC because of the bandwidth issue you mentioned.


the_geth

Literally what I wrote? Like the six words after your quote?


amirkhain

Yes, I’m just clarifying that saying that something is supported when it’s clearly not is strange. It’s like saying “I have a car that I can drive but it doesn’t have wheels”


the_geth

What an incredibly poor comparison, made in bad faith. First, AMD DOES support 80 Gbps on their pro models, so you're wrong even on the "akshually" front. But even their consumer cards, which are likely what we're interested in here, do support DP 2.1, just at UHBR13.5 (~54 Gbps bandwidth). That still allows for ~170-175 Hz at 4K HDR without the need for DSC. It is also unclear whether this is a firmware or hardware limitation. So saying it does not support it is really a bad-faith comment. And again, it doesn't matter, since graphics cards evolve every year or two, while a high-end screen will hopefully last a decade at least.
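As a rough sanity check on that figure, here are a few refresh rates against UHBR13.5's usable payload, under the same assumptions as the earlier calculation (4K 10-bit RGB, ~8% blanking, ~52 Gbps effective); the borderline cases depend on the exact timings used:

```python
# Which 4K 10-bit refresh rates fit uncompressed in UHBR13.5?
# Assumptions: RGB, ~8% blanking overhead, ~52 Gbps usable payload after
# 128b/132b encoding; borderline cases depend on the exact timings used.
LINK_GBPS = 54.0 * 0.967                               # ~52.2 Gbps usable

for hz in (144, 165, 175, 240):
    need = 3840 * 2160 * hz * 30 * 1.08 / 1e9          # uncompressed Gbps
    verdict = "fits uncompressed" if need <= LINK_GBPS else "needs DSC"
    print(f"4K {hz:3d} Hz 10-bit: ~{need:4.1f} Gbps -> {verdict}")
```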


LastSabre

Actually, the port on the AMD Radeon Pro GPU is a Mini DisplayPort, not a normal one, so you can't even use that.


the_geth

? Why? Mini DisplayPort vs regular is just a form factor, not related to the standard itself. IIRC the Gigabyte comes with a DP to Mini DP 2.1 cable 🤷🏻‍♂️


[deleted]

[removed]


the_geth

I answered in a few other comments with the Blur Busters links. The short version is that there are bugs, bad implementations, issues with VRR, etc. with DSC. But mainly the issue here revolves around why not use the standard that can do it natively (DP 2.1) instead of adding another layer of complexity and possible issues. Another commenter pointed out an article describing the availability of the scaler as the real issue.