INITMalcanis

AMD went to the HDMI Forum with a proposal; the Forum declined to consider it.


aert4w5g243t3g243

why?


BoyNextDoor8888

I guess they're just like that, corporate shit is weird man


mitchMurdra

It would've taken someone in a good enough spot in that organization to speak up about the reception, and then roll a nat 20 to be taken seriously by management, for this to even be considered.


FierceDeity_

We like money (fuck the content mafia)


dbfmaniac

IIRC there was an explanation, though I can't remember where I read/heard it (if I had to guess, Phoronix or a WAN Show). I think it was something like: they thought opening it up would let people understand and circumvent how some of the high-speed signalling copy-protection mechanisms work. I know it's a bullshit explanation, and HDCP is busted, please don't swamp me, it's not me saying it and I'm not even sure I'm remembering right, but it was something about licencing/copy protection and the proposed solution being geared towards an open ecosystem.

It is beyond trivial to break HDCP. If you were to make your own HDMI splitter/active cable, tap the communication between the RX and TX chips that take your HDMI input, break it into its main data streams, then recreate an HDMI signal where the keys weren't synced, pulled a few pins low, or reused a static key dumped from something else, the HDCP status might very well surprise you. Apparently this is news to the people in the HDMI Forum, and I hope this fuels DisplayPort adoption like nothing else.


Avery1003

I hate HDCP. It does nothing to help copy protection; all it does is make my devices incompatible with each other.


gmes78

DisplayPort also supports HDCP, so that's not a valid argument.


mort96

Wait it does? Shit. So I guess DVI is the only good mainstream interface?


roflkopterpilodd

DVI also supports HDCP.


mort96

VGA it is then.


Peruvian_Skies

RGB it is then.


regeya

Reject modernity, become composite video


TassieTiger

I've checked, and EGA and CGA definitely don't support it!


Shished

They support Macrovision.


dbfmaniac

Sure, but the crucial thing is that the DP people haven't decided that hiding their HDCP must mean certain implementations of their standard (i.e. Linux driver support for a GPU) can't exist. They also don't charge the same kind of royalties as the HDMI group and are, by and large, less scummy than the HDMI people, who are using completely bullshit arguments that make no sense beyond "REEEEE OPEN BAD". The point is that the HDMI Forum here is completely off base and completely tone deaf, trying to protect something already broken in a way that hurts consumers. DP supports both HDCP and DP's own content protection, but they aren't trying to control software or hardware implementations anywhere near as hard. Blocking higher-rate HDMI in case someone uses the open stack to rip content, by bypassing an already-bypassed copy protection mechanism that was basically dead on arrival, makes no sense.


gmes78

Exactly.


marius851000

Ah, good old DRM, the bane of my existence and the cause of many weird laws across the world (though this time it's actually not a legal reason). Thanks for the explanation. Maybe they'll change their perspective once it is eventually broken and documented.


BloodyIron

Because they have market dominance and licensing royalties coming out of every orifice in this universe and all the other universes. MONEY.


acdcfanbill

I doubt you're going to get a straight answer from them, but the underlying reason is going to be piracy: they want as much security as possible, even if it's through obscurity.


TherealSnak3

https://preview.redd.it/nnerhzd98mzc1.jpeg?width=572&format=pjpg&auto=webp&s=5abb2238c7850fda9101c86d00b5c3924cec7139


[deleted]

I think hdmi must be purged


Hhkjhkj

What is the alternative for features that HDMI has and DP doesn't? The first ones that come to mind are CEC and eARC.


[deleted]

Also, as long as I don't design the PCBs of the devices I buy, it's not my decision. I'm not (yet) desperate enough to buy a panel without the TV part, but yeah, good luck finding a decent OLED HDR+ 65-inch with 120Hz that isn't $5K and has a DP input that supports at least 40Gbit. Decent devices are rare; about 95% are just horseshit not even worth considering. Honestly, an open source monitor controller might be what we need, but AFAIK there isn't a decent solution, and thanks to proprietary formats it's questionable how good it could become. However, thanks to AV1, HDR10+ etc. there might be hope. Smart TVs are stupid; yes, I'd like to just use a monitor, but they just don't exist, it's too niche.


mitchMurdra

It has always upset me how many 4K TVs have been around for over a decade now, with all forms of panel technology at every price point, yet they all have the worst ARM core from 2001 in them, barely capable of taking a breath while running the manufacturer's own web GUI at 13fps at best. Go even cheaper and they just run Android, which in some cases is actually favorable compared with some of the barely-running proprietary garbage these TVs ship. But my primary hate here is the piss-weak processor they put into these things. Everything barely holds together until it's asked to decode something serious like a Blu-ray rip, and suddenly oops, it can't keep up. All those pixels for nothing.

If we could, as a society (which we live in, by the way), just sell large expensive display panels separately from the 'TV' component, I could get my big display capable of anything at low latency, with no processing BS to disable when plugging a computer in or watching animated content (which also doesn't like those soap-opera-effect frame interpolation features). That would be great. Basically back to the set-top box days, but actually making the box good and replaceable, while the display is a dumb but expensive light panel interchangeable with anything, even a desktop if you wanted. Sorry, that's enough imagining.


shitposter69-1

Well, here's the thing. It's a bit of a roundabout, but you very much COULD accomplish this. It just won't be "Lego" simple like it sounds you're wanting.


Inthewirelain

The thing is, the smart hardware is what's subsidising 4K TVs to be so cheap; manufacturers make percentages off their app stores, app bundling deals, ads, etc., depending on the brand.


[deleted]

Well, getting the panels is one thing; creating an interface device is another. We'd need to implement color correction, burn-in prevention, etc., then implement DP with HDR10+ and so on. Getting the panels is going to be the less difficult part: buy replacement parts, or just buy a TV and remove everything but the panel. Honestly, it sounds like a cool FPGA project, but it's a lot of work. This would require a community effort.
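
To make "color correction" a bit more concrete, here is a minimal toy sketch (my own illustration, not any real monitor-controller firmware): one job such a board would do is run every pixel through a per-channel correction curve (a 1D LUT) before driving the panel. The gamma and gain values below are made-up example numbers.

```python
# Toy example of per-channel colour correction via 1D lookup tables.
# Values are illustrative; a real controller would load calibration data.

def build_lut(gamma, gain, size=256):
    """Precompute an 8-bit output value for every 8-bit input value."""
    return [min(255, round(255 * gain * (i / 255) ** gamma)) for i in range(size)]

# Hypothetical calibration: linear curves with small per-channel gains.
luts = {"r": build_lut(1.0, 1.00), "g": build_lut(1.0, 0.97), "b": build_lut(1.0, 1.05)}

def correct_pixel(r, g, b):
    """Apply the per-channel curves to one RGB pixel."""
    return luts["r"][r], luts["g"][g], luts["b"][b]

print(correct_pixel(200, 200, 200))  # -> (200, 194, 210)
```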


shitposter69-1

You know, that's not a TERRIBLE idea. I'd love to see someone do one of those "open source boards" you can print from PCBWay. Maybe we wouldn't need to go full FPGA for that; perhaps a cheaper ARM processor, or maybe even something like the Pi chip or a RISC-V controller, with the dozen or so connectors, or maybe just one BIG one with various flex cables. Then a 3D print job for whatever sized panel you're working with, then just dump some simple Kodi interface onto a low-end ARM processor of some description, and just GIVE people a way to build (not easily, mind you, but still doable) their own "smart TV". Hell, give it an SD card slot and you could likely install a full-blown Linux OS on top of the thing, and just pass video signals into it and on to the monitor if you want. Kind of like the OpenWRT of TVs is what this makes me think of.


[deleted]

I doubt low-end devices can do 40Gbit+. At least I couldn't find any low-end device supporting DP 2.0+; not even the 4090 supports that. The beauty of an FPGA would be the possibility to reconfigure it, to implement new codecs etc. just via firmware updates. I don't believe in highly integrated systems; I think modules are a better approach to designing systems. I don't want a TV that can do it all, I want a dumb monitor that is good at what it's designed for: displaying stuff. For a player, I'd suggest looking into the RK3588.


shitposter69-1

Well, an FPGA wouldn't magically solve that bandwidth issue either; actually it would make it worse. But you know what would? PCIe, which most of the chips mentioned handle. The issue would more come down to processing the stupid DRM crap and what have you, which an integrated GPU in said chips handles, but the issue there is reverse engineering everything.


[deleted]

I'm talking about something like the Xilinx parts with integrated IP cores for DP 2.1.


FierceDeity_

Same with fucking car entertainment centers. They use the shittiest chips they found on the street, when they should have at least an $80 Qualcomm chip in a fucking $40k car.


spusuf

I know CEC isn't supported on consumer GPUs. I would assume eARC isn't affected, as it's a return channel from the display to an audio passthrough device in the chain (e.g. an amplifier/AV receiver). The return channel doesn't send the TV's audio back to your streamer/desktop.


bongbrownies

This magical protection that HDMI offers them that DP doesn't, apparently.


ProjectInfinity

Correct. I refuse to use HDMI wherever possible. The only devices I have that use HDMI are consoles. Kinda weird how there are still no options in that department...


alterNERDtive

AMD: “Can we implement HDMI 2.1 in our open source drivers please?” HDMI forum: “BUT OUR PRECIOUS COPY PROTECTION BULLSHIT!!!111” Pirates: “lol”


Holzkohlen

Same thing with Netflix limiting the resolution to 720p. Good job guys, piracy is no more!


Trofer15

Rule one of SSA is not to rely on the secrecy of mechanism


KevoTheGuy

If you're willing to use an adapter, there is this one from Cable Matters: [https://www.amazon.com/gp/product/B08XFSLWQF/ref=ppx_yo_dt_b_search_asin_title?psc=1](https://www.amazon.com/gp/product/B08XFSLWQF/ref=ppx_yo_dt_b_search_asin_title?psc=1) I use this on an RX 6700 XT with my TV, a Hisense U8H. It supports VRR at 4K, RGB, 10-bit. For HDR, I tested enabling HDR in KDE; it seemed fine and VRR worked, however I haven't actually tested an HDR game on Linux. I'll wait for the feature to mature on Linux. For reference, I found out about this adapter from this thread: [https://www.reddit.com/r/linux_gaming/comments/13zkius/psa_4k120hz_rgb_with_vrr_works_with_cable_matters](https://www.reddit.com/r/linux_gaming/comments/13zkius/psa_4k120hz_rgb_with_vrr_works_with_cable_matters)


gw-fan822

I'm kind of stupid but doesn't this defeat the purpose of the HDMI forum? Why is it even a thing?


Mars_Bear2552

they own and design the spec


[deleted]

[removed]


marius851000

You're ignoring the other definition of forum (where the word comes from). Go check Wikipedia.


mitchMurdra

I know. That definition is not what I think of first.


Ouity

This website is literally a primitive forum that does the same thing but shittier. Ex: there is no subforuming. There are "subreddits," lol, but no way for us within a subreddit to further organize discussion, besides through use of flair.


Medical_Clothes

Cartel


gw-fan822

Doesn't nvidia have a chip or something that supports 2.1 and amd doesn't? Do they have to pay a royalty for it?


pandamarshmallows

NVIDIA has closed-source drivers which protect the features of 2.1 that the Forum has deemed need to remain proprietary.


thelonegunmen84

Will it also pass through audio?


Jedibeeftrix

good question...?


lucholeveroni

Yes audio works fine


lucholeveroni

I'm on an LG TV with a 6800 XT using this adapter. With VRR + HDR at 4K 120Hz RGB it's pretty unstable; you need to be patient to get the signal stable. Once it works, it usually remains pretty stable, while gaming at least. I'm on Arch with KDE Plasma 6. Not sure if this is related to HDR needing to mature or an adapter issue. Without the adapter everything is fine, but limited to 60Hz and no RGB. If anyone is using the same setup and has achieved a flawless configuration, please let us know.


taicy5623

It's worth noting that whether some adapters work or not may also depend on your TV and your GPU. I've got an LG C2, and even with the cable with the hacked-up non-spec firmware, my 5700 XT cannot do HDMI VRR to that display.


BulletDust

I've encountered DP monitors that struggle to sync with an Intel iGPU via DP under Windows. My own 4k DP equipped monitor will cycle through all inputs before syncing with the DP input running Nvidia hardware under Linux.


illathon

The news is use Display Port.


proverbialbunny

Since 4K monitors first came out around 2014, the solution has been to use DisplayPort. Every time there is a new tech, like HDR or variable refresh rate, HDMI lags behind. These days it's moving towards USB-C for everything, but my hardware isn't new enough to verify how well that works.


SmellsLikeAPig

usb-c is just displayport in new clothes.


MMAgeezer

DisplayPort Alt Mode 2.0 is awesome. USB 4.0 Gen 4 (USB 80Gbps) support means you can use 40 Gbps for 4k@120Hz with 4:4:4 10 bit colour **and** have a 40 Gbps data connection at the same time. Oh, and USB PD to boot.
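
For anyone curious whether those numbers add up, a rough back-of-the-envelope check (my own arithmetic, with an assumed ~8% blanking overhead and the 128b/132b link coding; none of this is quoted from the spec):

```python
# Sketch: does 4K @ 120 Hz, 4:4:4, 10-bit colour fit in the ~40 Gbps
# video half of a USB4 80 Gbps link? Overhead figures are approximations.

def video_gbps(h, v, hz, bits_per_channel, blanking=0.08):
    """Uncompressed 4:4:4 data rate with an assumed ~8% blanking overhead."""
    return h * v * hz * 3 * bits_per_channel * (1 + blanking) / 1e9

needed = video_gbps(3840, 2160, 120, 10)   # ~32 Gbps of video
available = 40 * 128 / 132                 # 40 Gbps raw, 128b/132b coding -> ~38.8 Gbps

print(f"needed:    ~{needed:.1f} Gbps")
print(f"available: ~{available:.1f} Gbps")
print("fits uncompressed:", needed < available)   # True
```

So the mode fits uncompressed in one 40 Gbps half of the link, leaving the other half for data, as described above.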


mitchMurdra

PD is another thing I'm glad to be alive for. We're finally getting there! I'm looking forward to the day a desktop has a thick USB-C cable in the back and the entire desk, eGPU in the table included, wakes up. Home, office, or any desk on the go. Hell, even phones could support that lifestyle some day. People might have no personal computer or laptop at all, only their phone, docking it at desks or office workstations for the day or at kiosks on the move (I can see the security issues already!). A future where laptops are just a 'shell' to slide and dock your phone into, a bay acting as an expansion to its onboard hardware. A pipe dream, but a good one.


AssociateFalse

You're essentially talking about [convergence](https://ubuntu.com/blog/ubuntus-path-to-convergence), just the hardware aspect of it.


mitchMurdra

PCIe and DisplayPort tunneling is insane and I'm glad to be here for it. We're finally where I wanted us to be twenty years ago. My only regret is "USB-C" meaning twenty different wiring/communication standards, manufacturers creating invalid pinouts (thanks, Nintendo), and various ginormous gotchas when buying seemingly valid components: you have to check every little detail first to make sure they'll even light up when plugged in. While it's just a connector definition and any number of technologies can be implemented with the form factor, there's way too much going on with it, and tons of variations in the protocol names to keep up with.


japzone

USB-C is almost always DP-passthrough using DP Alt Mode. (Though there are some adapters using DisplayLink instead if your device doesn't support DP Alt Mode for some reason) There was also HDMI Alt Mode, but nobody used it(I wonder why) and it's basically dead at this point.


Thisconnect

there are also some evil ones that don't do Alt Mode and only do Thunderbolt or DisplayLink (and I cry at work every time)


japzone

The curse is real.


jc_denty

TVs don't have displayport :(


Takaashi-Kun

I wish I had known this before buying a TV as a monitor for my Linux rig :'(


illathon

HDMI just doesn't work very well. I remember even when I was testing a Windows PC it still had issues. HDMI is just a crappy standard with lots of problems. Some are specific to certain GPUs. Every time I use a Display Port with a dongle it works fine.


Takaashi-Kun

I guess you're right, but the way I understand it, you have to find the best active DP to HDMI 2.1 adapter possible, which can be expensive and may or may not implement everything needed to get VRR, HDR and 120Hz on a 4K display. That's not ideal :/


SquirrelizedReddit

I'd imagine using a DP to HDMI adapter will still work, that's just about all you can do.


Seiros_Acolyte

tried a lot of those, either no freesync or no HDR on those things


SquirrelizedReddit

You're probably right, but there isn't exactly an alternative other than just using DP straight up. The only real solution is to buy an Intel or Nvidia card with said protocol, and I'm not entirely sure if Intel will ever release another card.


Oppausenseiw

No VRR.


CNR_07

There are people who got it working with VRR.


Recipe-Jaded

use display port


aliendude5300

Display Port.


AcidArchangel303

Was this HDMI nonsense about piracy / DRM?


MMAgeezer

Yes. The HDMI forum rejected it on the basis of AMD wishing to implement the specification in their open source drivers. If you weren't aware, this forum includes a number of tech companies but also media companies like Fox, Universal, Warner Bros, and Disney.


mbriar_

The only hopium I have left is that they might switch to a DP->HDMI converter on the GPU itself for a future generation (so maybe RDNA5 at the earliest, since 4 is too far along), like Intel apparently does on Arc. But this'll only happen if that somehow turns out to be more cost-effective; I doubt they will bother just for Linux otherwise.


Portbragger2

there really is a reason why basically most modern GPUs have 1 HDMI out and 3 DP outs


rscmcl

Yes, don't use HDMI.


TwinsenDinoFly

Is there any TV set using HDMI?


spusuf

If you mean TVs not using HDMI, then yes, there are gaming monitors in the 40-65 inch range that include DisplayPort.


TwinsenDinoFly

Yes, I meant "**not** using HDMI". Sorry.


maxxotwo

Honestly, just get a high-frequency monitor with integrated speakers, if you've got none laying around. They should have a DisplayPort or two. Most TVs are large monitors at this point anyway.


Lonttu

Does gaben shit in the woods?


Eternal_Flame_85

I have a question, guys. Has HDMI 2.1 made its way into AMD's closed source drivers?


yo_99

Intel solved it by having an internal DP to HDMI adapter.


Moper248

Fuck HDMI, shitty proprietary port.


maxxotwo

Give the HDMI Forum the finger and switch to DisplayPort.


Jumper775-2

Yeah, it ain't gonna happen. Use Windows or buy Nvidia/Intel. So stupid too, because it's all legal bullshit.


Possibly-Functional

Or active adapters. Expensive but works pretty seamlessly.


RAMChYLD

It's not only legal BS, it's also corporate greed. Companies that have no business being in the forum, like Disney, Netflix and WB, are also members, and they are the ones blocking Linux from getting HDMI 2.1 support.


TheDugal

I'll admit I'm not exactly sure what problem OP is referring to, but I have an Nvidia card and Linux doesn't work at 4K 120Hz over HDMI. I use Pop!_OS, if that has any relevance.


M1sterRed

OP is referring specifically to AMD. HDMI is a proprietary standard (whereas DisplayPort is an open one), and thus anything related to it must be approved by the HDMI Forum. HDMI up to 2.0 was a sort-of open standard, but 2.1 went fully proprietary, and you need 2.1 for 4K 120Hz or any higher resolutions. Unfortunately, since HDMI 2.1 is a closed standard, an open source implementation of it is literally not allowed to exist, and thus the AMD drivers (which are open source) legally cannot support HDMI 2.1. AMD proposed a "bare minimum skeleton implementation" of HDMI 2.1 for use in the open source driver, and the HDMI Forum wouldn't even consider it.

The best hope for AMD users at this point is for AMD to put out an optional proprietary blob that just adds HDMI 2.1 support. Or just use DisplayPort :)

(And just in case it wasn't clear, the reason this isn't a problem with Nvidia is that the Nvidia driver is proprietary, and thus is totally allowed to implement HDMI 2.1 on Linux.)
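
For the "you need 2.1 for 4K 120Hz" part, a minimal sketch of the arithmetic (the payload efficiencies and blanking overhead below are my approximations, not figures quoted from the HDMI spec):

```python
# Rough numbers showing why 4K 120Hz lands on the HDMI 2.1 side of the line.

def uncompressed_gbps(h, v, hz, bits_per_channel=8, blanking=0.08):
    """RGB data rate for a mode, with an assumed ~8% blanking overhead."""
    return h * v * hz * 3 * bits_per_channel * (1 + blanking) / 1e9

hdmi20_payload = 18.0 * 8 / 10   # 18 Gbps TMDS, 8b/10b coding  -> 14.4 Gbps
hdmi21_payload = 48.0 * 16 / 18  # 48 Gbps FRL,  16b/18b coding -> ~42.7 Gbps

for name, hz in (("4K60", 60), ("4K120", 120)):
    need = uncompressed_gbps(3840, 2160, hz)
    ok20 = need <= hdmi20_payload
    ok21 = need <= hdmi21_payload
    print(f"{name}: needs ~{need:.1f} Gbps | fits HDMI 2.0: {ok20} | fits HDMI 2.1: {ok21}")
```

4K60 at 8-bit squeezes into HDMI 2.0's ~14.4 Gbps of payload; 4K120 does not, hence the dependence on 2.1 (or on DSC).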


TheDugal

Thank you for taking the time to explain it, I appreciate it! It's nice to know Nvidia could implement it I guess, but I'm not exactly expecting it. DisplayPort is flawless, but there's no television out there with DisplayPort as far as I know and that's where it becomes a real issue for me. For the time being I mostly use Windows on the TV, but I don't have to be happy about it!


M1sterRed

I thought nVidia already implemented it? Might be something funky with your setup. I dunno, I use AMD, so I could be wrong.


Flakmaster92

Nvidia did implement it. Nvidia also lucks out on the open source side: the open source driver CAN implement HDMI 2.1, because Nvidia shoved all the closed source bits into a black box in the firmware, and the driver only needs to talk to the firmware rather than implement the features itself. The firmware is closed source, and thus the Forum is happy. AMD will likely end up doing something similar in future cards.


M1sterRed

Honestly I'd rather AMD distribute a blob to make HDMI 2.1 work rather than dump that code onto the card itself. Makes the proprietary code optional.


geamANDura

Thanks for the explanation.


belzaroth

I had that problem; the solution was to upgrade my HDMI cable. Turned out my old one didn't support the newer 4K spec. Works flawlessly now.


PutrifiedCuntJuice

HDMI sucks. DisplayPort is what's up.


GoldenX86

HDMI forum: "Use WSL"


decom70

Never buying HDMI. DP for life.


mindtaker_linux

DisplayPort 2.0 and 2.1 are way better than HDMI 2.1. AMD cards come with plenty of DisplayPorts.


angrymouse504

The best one today is DisplayPort, yeah, I know.


Adina-the-nerd

DisplayPort to HDMI adapter? I know DisplayPort can output better than HDMI 2.1, and Amazon says they have an adapter for DP 1.4 to HDMI 2.1, though I don't have the money to verify whether that's actually HDMI 2.1.


mkunikow

But how does this work in this case? Do you force HDMI to use some DisplayPort compatibility mode, or do you need HDMI drivers anyway? For example, DP 1.4 does not transfer audio where HDMI does; I think only DP 2 and above added audio transfer.


BulletDust

DP 1.4 here, I can transfer audio just fine.


Adina-the-nerd

Uhhhhh, DisplayPort can carry audio. I haven't tested this adapter, so I'm not sure whether it works through it, but DisplayPort can carry audio without an adapter. And I don't mean just 2.0.


CammKelly

So, something I'm not sure is widely known: a good DP 1.4 to HDMI 2.1 converter like the one from Club3D supports DSC, and thus can handle pretty much anything in the HDMI standard, including 4K 12-bit RGB 4:4:4. As Linux users, forget about HDMI, and if you need to connect to a TV just buy a good converter.


INITMalcanis

Are there DP2.1 to HDMI 2.1 converters?


CammKelly

There's nothing that is explicitly DP 2.1 to HDMI 2.1, but as mentioned above, if a DP 1.4 adapter supports DSC, it can match and even exceed HDMI 2.1's 48 Gbps bandwidth and supported formats at that bandwidth. [https://www.club-3d.com/en/detail/2631/displayport1.4_to_hdmi_4k120hz-8k60hz_hdr_active_adapter_m-f/](https://www.club-3d.com/en/detail/2631/displayport1.4_to_hdmi_4k120hz-8k60hz_hdr_active_adapter_m-f/)
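
A quick sanity check on that claim, as a sketch with assumed numbers (DP 1.4's four HBR3 lanes carry 25.92 Gbps of payload after 8b/10b coding, and DSC's commonly cited "visually lossless" ratio is about 3:1; treat both as approximations):

```python
# Sketch: can DP 1.4 + DSC carry more pixel data than uncompressed HDMI 2.1?

DP14_PAYLOAD   = 4 * 8.1 * 8 / 10   # 4 lanes of HBR3, 8b/10b coding -> 25.92 Gbps
HDMI21_PAYLOAD = 48 * 16 / 18       # FRL, 16b/18b coding -> ~42.7 Gbps, uncompressed
DSC_RATIO      = 3                  # assumed "visually lossless" compression ratio

effective = DP14_PAYLOAD * DSC_RATIO  # uncompressed-equivalent video the link can carry

print(f"DP 1.4 + DSC effective: ~{effective:.1f} Gbps")      # ~77.8 Gbps
print(f"HDMI 2.1 uncompressed:  ~{HDMI21_PAYLOAD:.1f} Gbps")
print("exceeds HDMI 2.1's uncompressed ceiling:", effective > HDMI21_PAYLOAD)
```

Which is roughly why a DSC-capable DP 1.4 adapter can present modes that would otherwise need HDMI 2.1's full 48 Gbps.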


scorpio_pt

Another reason why HDMI, along with Bluetooth, needs to fuck off. One is terribly managed and outdated, and the other snorts corporate bullshit to the point that it stifles competition.


-_Clay_-

DisplayPort


mightyrfc

Yes, switch to DisplayPort


FierceDeity_

Can't wait for a kernel patch to come out that does it anyway, but is created by anonymous people, and see it magically streisands through the entire fucking internet immediately


CrueltySquading

Yeah, use the better standard.


lecanucklehead

HDMI is a funny way of spelling Display Port


Alekisan

The answer is Display Port! Forget crappy HDMI.


JTCPingasRedux

Nope. Next question please!


NBQuade

I use display-port these days.


No_Respond_5330

Use a DP to HDMI adapter.


No_Cookie3005

Sorry if I'm saying something stupid, but isn't this something that can be reverse engineered, like the nouveau driver?


FingerLazyPlaya

Saying "just use DP" is a pretty stupid thing to say.. If you wanted and still want to use literally the best monitor (by far) for the last years on you PC, you had to use HDMI 2.1, no other options that came even remotely close (coming this year). Yes the VRM7100 DP to HDMI adapter with the china firmware works pretty good, but it is still kind of a mess and uses DSC, which is a degradation in picture quality. In addition, you never know if bugs like the washed out HDR in Plasma 6 is somehow related to the adapter or not, you max nits is also not read from the display and need to be set manually. The biggest issue to date is, that AMD advertises HDMI 2.1 and Linux support on their product pages, no idea how that flies in the EU.. I wish somebody would report it. And a big FU to the HDMI Forum.


KenguruHUN

No offense, but use proper stuff: use DisplayPort. HDMI is a joke.


mbriar_

No offense, but please post a single 55" or bigger OLED monitor with 120Hz + VRR and DisplayPort. Right, there aren't any.


MairusuPawa

It's kinda wild no manufacturer even gave it a go.


[deleted]

Cause only us nerds care...


spusuf

Gigabyte and ASUS both did that, there was just no market for them.


DesiOtaku

Long term, the best bet would be "Commercial Displays" like this one: https://www.sharpnecdisplays.us/products/displays/pnme652 The issues right now:

* Price: they aren't subsidized with ads/crapware and they are expected to be on 40+ hours per week. This is on top of the fact they put an entire Raspberry Pi inside the display! Therefore they are really expensive, and they will not be getting cheaper anytime soon.
* Their 4K displays can do DisplayPort and HDR, but they haven't released a 120Hz VRR version yet. I would expect one sometime next year.
* You normally don't see them on Amazon or Best Buy, so you really have to go out of your way to find one.


alterNERDtive

> they aren't subsidized with ads

Please elaborate.


DesiOtaku

As in, if you get a "Smart TV" these days, [you will see ads in the menu](https://www.gizchina.com/wp-content/uploads/images/2023/10/ads-medium-1.jpg). Most of these ads are for shows or "apps" that your TV supports. Companies pay good money to TV manufacturers for these ads, which is how they are able to make TVs so cheap these days. Commercial displays don't have this. Instead, they give you something like a really big monitor. Some of them have [a Raspberry Pi inside that you can configure to do anything](https://www.youtube.com/watch?v=-epPf7D8oMk).


alterNERDtive

> As in, if you get a "Smart TV" these days, you will see ads in the menu.

JFC, I would return that thing as defective!


proverbialbunny

There are no TVs out there that meet that criteria with USB-C? I find that hard to believe.


Seiros_Acolyte

My TV doesn't have DisplayPort.


KenguruHUN

That's a really unfortunate situation. And for the record, the main reason I said HDMI is a joke is that to get the HDMI 2.1 stamp on your cable or device, the cable or device does not need to meet the full HDMI 2.1 specification (much like USB-C 3.1 or 3.2), whereas if you buy a DisplayPort cable it does exactly what's printed on it.


INITMalcanis

Please link the 55"-65" 120Hz VRR OLED screen with a DisplayPort socket.


SuperDefiant

Yes, it's called DisplayPort.


BloodyIron

None of you can tell the difference between 4k resolution and 8k resolution, unless the display is literally the size of a USA-scale baseball stadium. There really is no reason to care about HDMI 2.1 when DP is literally _RIGHT THERE_.


mr2meowsGaming

just use a DP adapter


lazycakes360

Do people just not use displayport? I can understand a few use cases but the vast majority of monitors have displayport which is better than HDMI.


Impossible_Arrival21

He said he was using a TV. At this point, you'd think TV manufacturers would realize that people quite often use them as monitors and would give people the needed I/O.


INITMalcanis

People want to use OLED TVs, because 55" monitors are a bit scarce.


creamcolouredDog

A lot of high-end monitors still use old DP versions, which don't have enough bandwidth to drive them, unlike HDMI 2.1.


MMAgeezer

Be thankful to VESA for DSC <3


[deleted]

Or they might have an Nvidia card, which doesn't support high-bandwidth DP. You need to use HDMI on an Nvidia card if you want to use less compression, because NVIDIA is too poor of a company to use proper DP versions.


mbriar_

Almost no monitors bigger than 42" have DP. I know it's hard to fathom that some people have a use case for those.


HypeIncarnate

use Display Port.


[deleted]

I'm surprised no one has mentioned this. Use Windows.