17

I was having a look at the HDMI pinout and I thought: why would they use I²C for the display-host communication? My question here is about the design metrics that led to this choice.

HDMI is a fairly recent standard, while I²C has been around since 1982. I²C is meant for on-board, chip-to-chip communication, and moreover the standard allows multiple devices attached to the same bus. An HDMI cable can be some 15 m long, so the I²C signal should probably use higher-than-normal voltages to avoid too much noise, adding the need for transceivers on both sides. As for the multi-device aspect, I can't really see how you would attach more than one monitor to a single HDMI port unless you are being very, very non-standard.

I'm really not an expert in communication protocols, but I think that RS-485, CAN, or some other point-to-point, full-duplex, higher-SNR protocol would have been better.

So why would they choose I²C?

Note: I know this might be marked as "opinion-based"; I am hoping that somebody around here can think of or knows about some objective reasons.

Vladimir Cravero
  • +1 for a great question! I think that's related to CEC! I use STM32s, which have a CEC peripheral, and I'm eager to know the answer. – Roh Aug 04 '14 at 13:23
  • I served on some VESA panels as a standards rep from a semiconductor company (VGA) when DDC2 was being implemented. Philips was able to negotiate to get their standard implemented, which was a little contentious as it was a proprietary solution, although it is a good solution for plug and play. So @TurboJ has the right answer. At the time multi-drop was not considered important, as it was point-to-point analog (VGA). – placeholder Aug 04 '14 at 14:58

3 Answers

11

The DDC history in HDMI goes via DVI all the way back to VGA. It is implemented in such a way that you can simply hook up a standard I²C EEPROM memory chip on the monitor side, and those are almost as cheap as dirt (AT24C01 and compatibles).
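To make that concrete: on a typical Linux host the DDC channel is exposed as an ordinary I²C bus, with the EDID EEPROM answering at the fixed 7-bit address 0x50. A minimal sketch, assuming the i2c-dev driver is loaded and that /dev/i2c-4 (a machine-specific, illustrative bus number) happens to be the DDC channel:

```c
/* Minimal sketch: read the 128-byte base EDID block over DDC.
 * Assumes a Linux host where the DDC channel shows up as an
 * i2c-dev node -- the bus number (here 4) is machine-specific. */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/i2c-dev.h>

int main(void)
{
    unsigned char edid[128];
    unsigned char offset = 0x00;           /* EEPROM word address */
    int fd = open("/dev/i2c-4", O_RDWR);   /* illustrative bus number */
    if (fd < 0) { perror("open"); return 1; }

    /* The EDID EEPROM sits at the fixed 7-bit address 0x50. */
    if (ioctl(fd, I2C_SLAVE, 0x50) < 0) { perror("ioctl"); return 1; }

    /* Set the read pointer to 0, then read the base block --
     * exactly the same sequence you would use on a bare AT24C01. */
    if (write(fd, &offset, 1) != 1) { perror("write"); return 1; }
    if (read(fd, edid, sizeof edid) != (ssize_t)sizeof edid) {
        perror("read");
        return 1;
    }

    for (int i = 0; i < 128; i++)
        printf("%02x%c", edid[i], (i % 16 == 15) ? '\n' : ' ');
    close(fd);
    return 0;
}
```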

the I²C signal should probably use higher-than-normal voltages to avoid too much noise

Nope. The +5 Volts tell you a different story. What they might do instead is run a lower clock frequency on the bus. HDMI cables are usually well shielded, too.

So why would they choose I²C?

It was already there in DVI (with which HDMI is backward compatible), it works, and it is cheap.

Turbo J
  • So in summary, you're saying it's due to legacy compatibility, and it works fine, so why change it? – horta Aug 04 '14 at 13:59
3

I²C is very inexpensive and simple to implement, for a number of reasons. It is often used when just a few bytes need to be transferred. It is also a very structured interface, with a protocol that defines who should be talking at any given time. I²C, due to its age, is also well supported among IC manufacturers (hence why it's inexpensive and simple to implement). Due to the slow data rate, SNR is really not an issue: 3.3 V is a typical bus voltage, and the signal can be heavily low-pass filtered if necessary.
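That turn-taking is visible even at the API level. Here is a sketch of a typical EDID read expressed through Linux's combined-transaction ioctl; the bus number is machine-specific and purely illustrative, and the direction flag on each message says which side drives the data bytes:

```c
/* Sketch: the "who talks when" structure of an I²C exchange,
 * expressed via Linux's I2C_RDWR combined-transaction ioctl.
 * The bus number (4) is illustrative; 0x50 is the DDC EDID slave. */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/i2c.h>
#include <linux/i2c-dev.h>

int main(void)
{
    unsigned char offset = 0x00, edid[128];
    struct i2c_msg msgs[2] = {
        /* Message 1: the master addresses the slave and writes
         * the EEPROM word address; the slave only ACKs.        */
        { .addr = 0x50, .flags = 0,        .len = 1,   .buf = &offset },
        /* Message 2: after a repeated START the roles flip --
         * the slave drives the data, the master clocks and ACKs. */
        { .addr = 0x50, .flags = I2C_M_RD, .len = 128, .buf = edid   },
    };
    struct i2c_rdwr_ioctl_data xfer = { .msgs = msgs, .nmsgs = 2 };

    int fd = open("/dev/i2c-4", O_RDWR);
    if (fd < 0 || ioctl(fd, I2C_RDWR, &xfer) < 0) {
        perror("i2c");
        return 1;
    }
    printf("EDID version %d.%d\n", edid[18], edid[19]);
    close(fd);
    return 0;
}
```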

I think it's important to point out HOW the I²C would be used in a monitor. Not only would the I²C allow communication with multiple monitors, but with multiple devices (e.g. multiple ICs) within each monitor, although there is likely a separate I²C bus for each HDMI cable in most host systems. The I²C interface would likely be used to establish the connection with the host, where the host queries the monitor to find out things like its resolution, frame rate, manufacturer, name, and probably other things. I²C would not be fast enough to transfer image and sound data; that information goes through the TMDS wires, which are high-speed and designed for a high SNR.
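For a flavour of what that query returns: the host reads a 128-byte EDID block with a fixed layout (VESA EDID 1.x). A small self-contained sketch decoding two of its fields; the sample bytes are made up for illustration:

```c
/* Sketch: decode a couple of fields from a base EDID block.
 * Field offsets follow the VESA EDID 1.x layout; the sample
 * bytes below are made up for illustration. */
#include <stdio.h>

/* Manufacturer ID: bytes 8-9, big-endian, three 5-bit letters
 * packed after a reserved 0 bit; code 1 -> 'A', 26 -> 'Z'. */
static void decode_mfg(const unsigned char *edid, char out[4])
{
    unsigned v = (edid[8] << 8) | edid[9];
    out[0] = 'A' + ((v >> 10) & 0x1f) - 1;
    out[1] = 'A' + ((v >> 5) & 0x1f) - 1;
    out[2] = 'A' + (v & 0x1f) - 1;
    out[3] = '\0';
}

/* Byte 127 makes the whole 128-byte block sum to 0 mod 256. */
static int checksum_ok(const unsigned char *edid)
{
    unsigned sum = 0;
    for (int i = 0; i < 128; i++)
        sum += edid[i];
    return (sum & 0xff) == 0;
}

int main(void)
{
    unsigned char edid[128] = {
        0x00, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x00, /* fixed header */
        0x4c, 0x2d,                                     /* "SAM" packed */
    };
    unsigned sum = 0;
    for (int i = 0; i < 127; i++)
        sum += edid[i];
    edid[127] = (unsigned char)(256 - (sum & 0xff));    /* fix up checksum */

    char mfg[4];
    decode_mfg(edid, mfg);
    printf("manufacturer: %s, checksum %s\n",
           mfg, checksum_ok(edid) ? "ok" : "bad");
    return 0;
}
```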

kjgregory
  • So you are saying that in a multi-HDMI setup only one I²C transceiver is required host-side, and that's why multi-point communication is a nice thing to have? – Vladimir Cravero Aug 04 '14 at 14:00
  • You wouldn't even need a dedicated transceiver (as in a single IC whose sole function is to communicate over I²C). It could just be one minor responsibility of a bridge IC that is managing a variety of different interfaces. It is likely that there is a dedicated I²C bus for each monitor, however. One of the downfalls of I²C (IMO) is that no two slaves can be configured with the same bus address, and there is no protocol (that I'm aware of) for dynamically assigning new addresses to slaves. – kjgregory Aug 04 '14 at 14:16
  • Yep, that was my point; moreover, I'm guessing that two identical monitors have the same address, so you would need separate lines anyway. – Vladimir Cravero Aug 04 '14 at 14:17
  • I don't think that fact is really a big issue or a counter-argument to its use in HDMI, especially when you consider that pretty much any other protocol would require a separate interface for each monitor anyway. – kjgregory Aug 04 '14 at 14:24
  • Yeah I agree on that – Vladimir Cravero Aug 04 '14 at 14:25
0

It's cheap, it works, it was already there from the VGA era, and there was no real reason to change it.

Good engineering in the consumer space is cheap and works well enough (which HDMI mostly does); nobody wins points for designing something in that space that uses extra chips, has serious comms overhead, and supports complex multi-drop topologies for something like this.

The chip is read once at link bring-up, so even if you can only clock the thing at kHz rates, that is a non-problem for the hundred bytes or so of data. CAN or RS-485 would both have required more doing in a very cost-constrained consumer application.
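To put rough numbers on that (mine, not part of the original claim): a base EDID block is 128 bytes, and each I²C byte costs about 9 clock cycles (8 data bits plus an ACK), so even at a standard-mode 100 kHz clock the whole read is on the order of 128 × 9 / 100 000 ≈ 12 ms, paid exactly once at hot-plug.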

I suspect the DDC stuff was imported wholesale without a lot of thought, as in fact was most of the video timing (DisplayPort and HDMI are pretty much electrically identical), and the video timing can easily be traced at least as far back as composite video on CRTs: front porch, active video, back porch, retrace interval... It all looks very familiar to any old-school TV guy.

This is actually a somewhat rare case of a standards body NOT making changes to remove one manufacturer's advantage, and instead going with a known-to-work de facto standard. I would not have been surprised by I²C but with the bus pulled down and the active state being logic 1, or something equally asinine, just to avoid handing Philips/NXP/Nexperia an advantage!

Dan Mills