
Over the years of using HDMI monitors, I have always had some trouble with the RGB color range: full (0-255) vs. limited (16-235).

The HDMI Spec v1.3a says, in section 6.6 (page 95, PDF page 111):

Black and white levels for video components shall be either “Full Range” or “Limited Range.” YCBCR components shall always be Limited Range while RGB components may be either Full Range or Limited Range. While using RGB, Limited Range shall be used for all video formats defined in CEA-861-D, with the exception of VGA (640x480) format, which requires Full Range.

This basically says, "for all TV formats (except VGA), use limited-range RGB; the other formats are used by computer monitors, which we don't care about." In practice this obviously doesn't work, and didn't work, because there are too many 1080p computer monitors that expect full-range RGB. And it gets worse in the 4K era.
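
For concreteness, the difference between the two ranges is just a linear scaling of each RGB component; a minimal sketch in C (the helper name is my own, not anything from the spec):

```c
#include <stdio.h>
#include <stdint.h>
#include <math.h>

/* Map a full-range (0-255) RGB component to limited range (16-235). */
static uint8_t full_to_limited(uint8_t full)
{
    return (uint8_t)lround(16.0 + full * 219.0 / 255.0);
}

int main(void)
{
    /* Full-range white (255) becomes 235; mid grey (128) becomes about 126.
     * If source and sink disagree about the range, blacks and whites get
     * crushed or washed out, which is exactly the trouble described above. */
    printf("255 -> %u\n", full_to_limited(255));
    printf("128 -> %u\n", full_to_limited(128));
    return 0;
}
```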

Eventually I reached the conclusion that I'll never buy another HDMI monitor, but laptop vendors just won't switch to DisplayPort...

That said, at the same time, I was repeatedly surprised by how well those HDMI-to-DVI adapters (cables) work. The HDMI source (and the GPU driver) seems to detect such an adapter, notice that the sink is actually a monitor and not a TV, and then switch to full-range RGB instead.

How is this implemented in practice (e.g. by AMD and Intel GPU hardware and drivers)?

Related:

How to make a HDMI signal be identified as HDMI instead of DVI?

VESA DisplayPort Interoperability Guideline

user3528438
    "laptop vendors just won't switch to DisplayPort" - they are, but through an Alternate Mode on a USB-C port. – Finbarr Sep 12 '22 at 12:38
  • There are tons of laptops with (mini) DisplayPort. Though I’m not sure it changes much, I wouldn’t be surprised if the video data was exactly the same. – jcaron Sep 13 '22 at 12:24
  • I have a docking station connecting over USB-C to my laptop which has (among other things) two DisplayPorts and an HDMI connection. I got another dongle that adds a USB-A and an HDMI connector to the other USB-C port. There are plenty of possibilities on laptops, so it's not as bad as you think. Considering size is a factor on laptops, it is common that they don't have as many ports as a PC has by default. This does not mean you can't add them. – Mast Sep 13 '22 at 19:37
  • "laptop vendors just won't switch to DisplayPort" – my HP laptop from 2018 has only DP and VGA, but no HDMI; it seems a lot of HP products are DP-only. – user1686 Sep 14 '22 at 05:12

2 Answers


More on Justme's answer:

HDMI 1.3a specs:

8.3.3 DVI/HDMI Device Discrimination

In order to determine if a sink is an HDMI device, an HDMI Source shall check the E-EDID for the presence of an HDMI Vendor Specific Data Block within the first CEA Extension. Any device with an HDMI VSDB of any valid length, containing the IEEE Registration Identifier of 0x000C03, shall be treated as an HDMI device. Any device with an E-EDID that does not contain a CEA Extension or does not contain an HDMI VSDB of any valid length shall be treated by the Source as a DVI device (see Appendix C).
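
In other words, the source walks the data blocks in the first CEA-861 extension of the sink's EDID and looks for a Vendor Specific Data Block whose IEEE OUI is 0x000C03. A minimal sketch of that walk in C (block layout as in EDID/CEA-861; checksum and bounds validation omitted, the function name is my own):

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Return true if the first CEA-861 extension of this EDID contains an
 * HDMI Vendor Specific Data Block (IEEE OUI 0x000C03), i.e. the sink
 * should be treated as an HDMI device rather than a DVI device.
 * "edid" is the raw EDID: a 128-byte base block plus 128-byte extensions. */
bool sink_is_hdmi(const uint8_t *edid, size_t len)
{
    if (len < 256 || edid[126] == 0)     /* no extension blocks at all      */
        return false;

    const uint8_t *cea = edid + 128;     /* first extension block           */
    if (cea[0] != 0x02)                  /* not a CEA-861 extension         */
        return false;

    uint8_t dtd_offset = cea[2];         /* data block collection ends here */
    size_t i = 4;                        /* data blocks start at offset 4   */

    while (i < dtd_offset) {
        uint8_t tag = cea[i] >> 5;       /* bits 7..5: block tag code       */
        uint8_t blk_len = cea[i] & 0x1f; /* bits 4..0: payload length       */

        /* Tag 3 = Vendor Specific Data Block; the OUI is stored LSB first. */
        if (tag == 3 && blk_len >= 3 &&
            cea[i + 1] == 0x03 && cea[i + 2] == 0x0c && cea[i + 3] == 0x00)
            return true;

        i += 1 + blk_len;
    }
    return false;
}
```

As far as I can tell, this is essentially what the Linux DRM core does (see drm_detect_hdmi_monitor() / drm_display_info.is_hdmi in drm_edid.c), and both the AMD and Intel drivers then choose HDMI or DVI signaling, and the default RGB range, based on that result.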

user3528438

The adapter does nothing; it is just a piece of cable with an HDMI connector on one end and a DVI-D connector on the other end.

It's the monitor that says it's not an HDMI monitor but a DVI monitor.

The PC will read the monitor's EDID data over the DDC bus, and if the EDID does not indicate that it is an HDMI-compatible monitor, then it is a DVI monitor. The exact details of how this is determined are defined in the HDMI standard and can be read there, as the other answer has already quoted.
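
On a Linux PC you can inspect exactly what was read over DDC, since the DRM subsystem exposes the raw EDID in sysfs; a small sketch in C (the connector name below is only an example, list /sys/class/drm to find the ones on your machine):

```c
#include <stdio.h>

/* Dump the raw EDID the kernel read over DDC for one connector, as hex.
 * The connector name is just an example and differs per machine/GPU. */
int main(void)
{
    const char *path = "/sys/class/drm/card0-HDMI-A-1/edid";
    FILE *f = fopen(path, "rb");
    if (!f) { perror(path); return 1; }

    unsigned char buf[512];
    size_t n = fread(buf, 1, sizeof buf, f);
    fclose(f);

    for (size_t i = 0; i < n; i++)
        printf("%02x%c", buf[i], (i % 16 == 15) ? '\n' : ' ');
    printf("\n");
    return 0;
}
```

If the dump is only 128 bytes, there is no CEA extension at all, so the sink is a DVI device by definition.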

And when connected to a DVI sink device, the HDMI source must (as per the spec) switch to DVI-compatible output with no guard bands and no data islands, because HDMI must (as per the spec) only be sent to a device which says it supports HDMI, and the HDMI protocol relies on guard bands and data islands to mark that it is HDMI rather than DVI.

And the requirement for consumer TV resolutions to be transmitted as limited-range RGB (16-235) from an HDMI output comes from a time when HDMI was a new standard for consumer video, while DVI had already gained momentum for computer use.

Computer graphics used their own established resolutions and consumer/professional video devices used their own established resolutions.

Computers also typically used full range 8-bit RGB (0-255) for graphics while professional video typically used 10-bit limited range for digital component (YCbCr) video, commonly truncated to 8-bit limited range for consumer use.

Early HDMI versions did not support signaling to indicate whether the data sent is limited or full range.

Therefore, it made sense to assume that computer graphics resolutions are sent as full range like before on DVI computer interface, and TV video resolutions are sent as limited range like they were previously sent over digital interfaces such as SDI.

Which also means that a 1080p monitor likely expects full-range data from a DVI connector, while it expects limited-range data by default from an HDMI connector. Modern HDMI monitors may additionally support full-range data over HDMI, which the PC can detect and optionally use if the display supports it.
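
That "supports full range over HDMI" advertisement lives in the CEA-861 Video Capability Data Block (the QS bit, "RGB quantization range selectable"); if it is set, the source may send either range and then declares its choice in the AVI InfoFrame. A sketch of checking it (raw EDID buffer as input, validation omitted, the function name is my own):

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Return true if the first CEA-861 extension contains a Video Capability
 * Data Block with the QS bit set, i.e. the sink accepts either full- or
 * limited-range RGB and the source may choose (signaling its choice in
 * the AVI InfoFrame). Sketch only; checksums and bounds not validated. */
bool sink_rgb_range_selectable(const uint8_t *edid, size_t len)
{
    if (len < 256 || edid[126] == 0 || edid[128] != 0x02)
        return false;

    const uint8_t *cea = edid + 128;
    uint8_t dtd_offset = cea[2];

    for (size_t i = 4; i < dtd_offset; i += 1 + (cea[i] & 0x1f)) {
        uint8_t tag = cea[i] >> 5;
        uint8_t blk_len = cea[i] & 0x1f;

        /* Tag 7 = "use extended tag"; extended tag 0 = Video Capability. */
        if (tag == 7 && blk_len >= 2 && cea[i + 1] == 0x00)
            return (cea[i + 2] & 0x40) != 0;   /* bit 6: QS */
    }
    return false;
}
```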

Another thing is which standard is used to send the 1080p video. If it is sent with CEA video timing, that's a limited-range consumer format for TVs, but if it is sent with a VESA video timing such as Reduced Blanking, it is no longer a CEA format, so full-range computer graphics data can be assumed.
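
To make that concrete, the same 1920x1080@60 picture uses quite different link timings depending on which standard generates it; a small comparison in C (figures taken from the CEA-861 and VESA CVT Reduced Blanking timing tables; the struct is only for illustration):

```c
#include <stdio.h>

/* Two ways to carry 1920x1080@60: the CEA-861 TV timing (VIC 16) and the
 * VESA CVT Reduced Blanking timing. The "limited by default" column
 * reflects the HDMI rule discussed above, not anything in the timing itself. */
struct timing {
    const char *name;
    double pixel_clock_mhz;
    int htotal, vtotal;
    int limited_by_default;
};

int main(void)
{
    const struct timing t[] = {
        { "CEA-861 VIC 16, 1080p60",  148.5, 2200, 1125, 1 },
        { "VESA CVT-RB 1920x1080@60", 138.5, 2080, 1111, 0 },
    };

    for (int i = 0; i < 2; i++)
        printf("%-26s  clock %.1f MHz, total %dx%d, %s range by default\n",
               t[i].name, t[i].pixel_clock_mhz, t[i].htotal, t[i].vtotal,
               t[i].limited_by_default ? "limited" : "full");
    return 0;
}
```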

So this may not be easy to digest as a consumer: you need to think about which cables and connectors to use when connecting equipment, additionally allow or force full-range output on the PC, and maybe even set the video timing standard.

Justme
    Whether the adapter "does nothing" depends on the adapter. If you look at the [second reference](https://events.static.linuxfound.org/sites/events/files/slides/elce2017.pdf) given in the question, you'll note that there are two kinds of adapters: Type 1 which is indeed passive, but is rated only to 165 MHz, and Type 2 which "[provides] a register indicating the adapter type.... The Intel Linux GPU driver will check for this and prevent using frequencies higher than 165 MHz if a Type 1 is detected. The Intel Apple and Windows GPU drivers do not check for this, so it may or may not work." – cjs Sep 13 '22 at 07:43
    @cjs This is about connecting HDMI output to DVI monitor. You talk about DisplayPort to HDMI adapters which are unrelated. Even if you used one to connect to DVI monitor, there will be a single-link DVI connection, and single-link connections are up to 165 MHz, so there is no problem. HDMI output can go up to 340 MHz using HDMI protocol and up to 600 MHz using HDMI 2.0 protocol, and up to 40 Gbps using HDMI 2.1 protocol. – Justme Sep 13 '22 at 08:12
  • You can get adapters with HDMI male & HDMI female connectors, which pass through the video signals directly, but insert a different (often programmable) EDID. This can be useful, for example, to tell the computer "This device is really DVI not HDMI" (or vice versa) when it ends up doing the wrong thing by default. – psmears Sep 13 '22 at 09:18
  • @psmears That does not help. Since the PC has HDMI output, it is mandated by HDMI rules, which means TV resolutions must be output as limited range video levels to a DVI display. Unless forced by user via drivers to output full range regardless. Of course HDMI displays can advertise support for full range and PC can automatically send full range. And you can't expect a DVI monitor to work properly when HDMI protocol is sent to it, and many don't. – Justme Sep 13 '22 at 10:05
  • @Justme: I think you may have misread my comment. It certainly can (and does) help in many situations. Finding the right EDID content to convince the computer to output a suitable signal to make the display behave as you want can certainly be tricky, but is often possible. – psmears Sep 13 '22 at 10:10
  • @psmears I know because I have to deal with weird EDIDs and find workarounds. My point was simply about the claims you mention. You can't input HDMI protocol to a DVI display, and from a HDMI output, you can't get TV-resolutions in DVI protocol with full range values with a custom EDID, unless you force it on the PC. So it does not help in this specific case. – Justme Sep 13 '22 at 10:21
  • @Justme: (1) Note that I specifically said "in many situations" - i.e. I'm not just talking about this one! (2) The thing is, there's not really any such thing as "HDMI protocol" versus "DVI protocol" - there are a range of video formats and ways of encoding them, and there's considerable overlap between HDMI and DVI (that is, one video signal can be compliant with both HDMI and DVI specs at the same time); sometimes telling the source that the display is HDMI/DVI may convince it to do something different, even though the signal it outputs is still compatible with both HDMI/DVI. – psmears Sep 13 '22 at 10:44
  • @psmears You are incorrect. Both are protocols to send video. DVI protocol only supports sending video data periods and control (sync) periods. HDMI protocol adds data island periods used to send metadata and audio. HDMI outputs must use DVI protocol to DVI displays because they can't handle data islands and only support 8-bit RGB format. HDMI outputs must use HDMI protocol for HDMI monitors and send metadata about what kind of format is being sent so how it should be received (RGB vs YCbCr color space, 444/422/420 chroma subsampling, or encoding of 30/36 bits per pixel data to 24-bit link, etc). – Justme Sep 13 '22 at 11:17
  • @Justme: \*sigh\* Yes it's all more complicated than fits in a comment. Yes I know about HDMI data islands, but (a) they're all optional, so may not be sent in all cases, and (b) even when they are, in practice a given DVI display can't tell the difference between a data island and a data error (and in a protocol at multi-gigabit data rates with no error correction, there will be data errors and the display must cope), so in practice DVI-only displays often just ignore them. And HDMI devices must support DVI signals (it's in the HDMI spec). So in real life it works better than you might think. – psmears Sep 13 '22 at 13:57
    Yes there are more even more nuances to this (eg life might be easier if all devices kept to the specs, or behaved consistently and sensibly, but in practice they don't!). My claim was that it can be _useful_ to insert a different EDID, _for example_ lying about the HDMI vs DVI status or indeed perhaps something else. This is certainly true because I've done it (both ways), on a number of occasions, and it was indeed very useful :) Your experience may be different (there is huge variety in support among devices), but saying that "this is _never_ useful" is too strong a claim. – psmears Sep 13 '22 at 13:58
  • @psmears Yes, our mileage will vary, depending on how deep you are designing video equipment. But HDMI data islands are not optional, they are mandatory in HDMI protocol, because sending no data islands means it's DVI protocol. You can read the linked HDMI standard if you want. And while altering EDID contents is useful, it's the specific example that is poor - you definitely run into problems by forcing HDMI signal into DVI chips that get confused by them, but it might be useful to trick DVI out into HDMI input. – Justme Sep 13 '22 at 17:15
  • @Justme: I'm not sure what to say. The data islands _are_ optional--read the spec carefully!. (Again, simplifying here due to lack of space) In theory _if_ a source is capable of sending them (not all have to be), it must always, but since sinks must cope with DVI signals, sources can get away with being out of spec on that. One use of forcing HDMI is if you want audio: dupe the signal to (a) an audio processor (so you get sound) and (b) a DVI projector (for picture). As I've said - in practice DVI devices will ignore data islands (treating them as datastream errors). Works like a charm :) – psmears Sep 13 '22 at 19:57
  • @psmears I've read the spec for years now and they still say "at least one Data Island shall be transmitted during every two video fields" for the signal to be HDMI. HDMI also has guard bands which DVI does not have so they must be removed for DVI protocol output. A typical DVI receiver just does not like HDMI protocol while some might. Bear in mind that DVI input might be implemented with HDMI receiver chip with DVI EDID so obviously it can accept HDMI protocol. Many signal splitters can also take HDMI in and output HDMI to amp and DVI to projector. – Justme Sep 13 '22 at 21:32