I was having a look at the HDMI pinout and I thought: why would they use I\$^2\$C for the display-host communication? My question here is about the design metrics that led to this choice.
HDMI is a fairly recent standard, while I\$^2\$C has been around since 1982. I\$^2\$C is meant for on-board, chip-to-chip communication; moreover, the standard allows multiple devices attached to the same bus. An HDMI cable can be some 15 m long, so the I\$^2\$C signal would probably have to use higher-than-normal voltages to keep an acceptable noise margin, adding the need for transceivers on both sides. As for the multi-device capability, I can't really see how you would attach more than one monitor to a single HDMI port unless you are being very, very non-standard.
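For context, the traffic this channel carries is the DDC: essentially the host reading the monitor's EDID EEPROM, which sits at the standard I\$^2\$C address 0x50. Here is a minimal sketch of what the host side does, assuming a Linux box where the DDC pair is exposed as a generic I\$^2\$C adapter (the `/dev/i2c-1` bus number is hypothetical and board-specific):

```c
/*
 * Sketch: reading the first 128-byte EDID block over HDMI's DDC lines,
 * which are plain I2C with the EDID ROM at 7-bit address 0x50.
 * Assumes a Linux host exposing the DDC pins as /dev/i2c-1
 * (the bus number is hypothetical).
 */
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/i2c-dev.h>

int main(void)
{
    int fd = open("/dev/i2c-1", O_RDWR);      /* DDC exposed as an I2C adapter */
    if (fd < 0) { perror("open"); return 1; }

    if (ioctl(fd, I2C_SLAVE, 0x50) < 0) {     /* EDID EEPROM lives at 0x50 */
        perror("ioctl");
        return 1;
    }

    unsigned char offset = 0x00;              /* start reading at byte 0 */
    unsigned char edid[128];
    if (write(fd, &offset, 1) != 1 ||
        read(fd, edid, sizeof edid) != (ssize_t)sizeof edid) {
        perror("edid read");
        return 1;
    }

    /* A valid block starts with the fixed header 00 FF FF FF FF FF FF 00 */
    printf("EDID header: %02x %02x ... %02x\n", edid[0], edid[1], edid[7]);

    close(fd);
    return 0;
}
```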
I'm really not an expert in communication protocols, but I think that RS-485, CAN, or some other point-to-point, full-duplex, higher-SNR protocol would have been a better fit.
So why would they choose I\$^2\$C?
Note: I know this might be flagged as "opinion-based"; I'm hoping that somebody here can think of, or knows about, some objective reasons.