The DisplayPort link runs at a fixed line rate of either 1.62, 2.7, or 5.4 Gbit/s per lane. The pixel stream clock (strm_clk) it carries runs at an arbitrary frequency and is likely to be asynchronous to the link clock (link_clk). The receiver is supposed to recover strm_clk from two numbers M and N as their ratio, i.e., strm_clk/link_clk = M/N, where N is fixed to 32,768 and M is the number of strm_clk cycles counted during N link_clk cycles. In asynchronous mode, we would expect M to vary slightly over time. So far so good.
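To make the numbers concrete, here is my own sketch of that relationship. The specific clocks (a 270 MHz link clock for the 2.7 Gbit/s rate, a 148.5 MHz 1080p60 pixel clock) are just assumed examples, not from the spec:

```python
# My own sketch, with assumed example clocks (not from the spec text):
N = 32768             # fixed by the DisplayPort spec
link_clk = 270e6      # assumed: 2.7 Gbit/s link -> 270 MHz link clock
strm_clk = 148.5e6    # assumed: 1080p60 pixel clock

# M = number of strm_clk cycles counted during N link_clk cycles,
# so strm_clk / link_clk = M / N:
M_exact = strm_clk * N / link_clk
print(M_exact)        # 18022.4 -> a real counter toggles between 18022 and 18023

# The receiver recovers the pixel clock from (M, N):
recovered = link_clk * round(M_exact) / N
print(recovered)      # ~148.4967 MHz, i.e. 148.5 MHz to within M's quantization
```

The fractional M_exact is exactly what makes me expect M to wobble between adjacent integers from one measurement window to the next.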
Now, the problems:
M and N are sent once per frame, whereas M is re-measured every N=32,768 link_clk cycles, which amounts to roughly 2-3 video lines. How can the receiver track the delicate changes in M when it sees M so rarely (only once per frame, i.e., once in millions of link_clk cycles, during the vertical blanking)?
The lowest 8 bits of M are sent every line, which may explain the above. But since M is reset every N=32,768 link_clk cycles, so are its lower 8 bits, which would seem to undermine the ability to use these bits as a reference between the two clocks (due to what appear as sudden resets on these bits). How does that work - are there two separate counters?
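To make my mental model of the per-window counting concrete, here is a small sketch (the 0.55 clock ratio is an assumed example). It counts how many whole strm_clk edges fall into each N-cycle link_clk window, which is how I picture M being re-measured, and then looks at the low byte of each result:

```python
# Sketch of per-window M measurement; the 0.55 ratio is an assumed example.
from fractions import Fraction

N = 32768
ratio = Fraction(11, 20)         # assumed strm_clk / link_clk = 0.55
phase = Fraction(0)              # stream-clock phase in strm_clk cycles
counts = []
for window in range(4):
    end = phase + ratio * N      # phase at the end of this N-link_clk window
    counts.append(int(end) - int(phase))  # whole strm_clk edges in the window
    phase = end
print(counts)                    # [18022, 18022, 18023, 18022]
print([m & 0xFF for m in counts])  # low bytes: [102, 102, 103, 102]
```

Under this model consecutive window values of M differ by at most 1, so the low byte barely moves between windows - which is what makes the "reset" framing in my question confusing to me.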
Another (minor) issue: in every stream clock cycle, 24 bits are produced (in RGB 4:4:4 mode), while in every link_clk cycle DisplayPort carries either 4 or 8 bytes (depending on the number of lanes and their speed). Does that mean that in every strm_clk cycle, M should increment in a fractional manner (i.e., by 3, by 3, then by 2)?
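The "by 3, by 3, then by 2" pattern I have in mind is a Bresenham-style integer accumulator. Here is a sketch of that arithmetic with assumed numbers (8 link bytes per cycle, 3 bytes per pixel), just to show the pattern I mean - not a claim about what the spec actually does:

```python
# Bresenham-style accumulator sketch; the 8-bytes-per-link_clk figure
# is an assumed example (e.g. 4 lanes), not taken from the spec text.
BYTES_PER_LINK_CLK = 8
BYTES_PER_PIXEL = 3              # RGB 4:4:4, 24 bits per pixel

acc = 0
increments = []
for _ in range(6):
    acc += BYTES_PER_LINK_CLK
    pixels = acc // BYTES_PER_PIXEL   # whole pixels completed this cycle
    acc -= pixels * BYTES_PER_PIXEL   # carry the fractional remainder
    increments.append(pixels)
print(increments)                # [2, 3, 3, 2, 3, 3] -> averages 8/3 per cycle
```

The per-cycle increments average exactly 8/3 pixels per link_clk, with only an integer counter and a remainder - that is the "fractional increment" behavior I am asking about.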
I find the standard hard to understand - basically, page 60 here:
http://read.pudn.com/downloads98/sourcecode/others/400577/DportV1.pdf
Bottom line: can anyone explain how this M,N scheme is supposed to work given the issues above?
Any thoughts would be highly appreciated!