Many FPGAs have phase-locked loops (PLLs) that can multiply the frequency of a clock. The signal path of a PLL is rather simple:
Phase detector -> Averaging Filter (LPF) -> Oscillator
where the oscillator output is fed back to the phase detector.
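For intuition, the loop above can be sketched as a toy discrete-time simulation: a phase detector, a single-pole averaging filter, and an oscillator whose frequency is steered by the filtered error. This is a hypothetical first-order model for illustration only, not any vendor's actual PLL implementation; all parameter names and values are assumptions.

```python
import math

def simulate_pll(f_ref, f_free, kp=0.5, lpf_alpha=0.01, n_steps=2000, fs=1e6):
    """Toy discrete-time PLL loop (illustrative only).

    f_ref    -- reference clock frequency [Hz]
    f_free   -- free-running oscillator frequency [Hz]
    kp       -- phase-detector gain (assumed value)
    lpf_alpha-- smoothing factor of the averaging filter (assumed value)
    Returns the final wrapped phase error in radians.
    """
    dt = 1.0 / fs
    ref_phase = 0.0
    vco_phase = 0.0
    ctrl = 0.0  # filtered control signal, here expressed as a frequency offset [Hz]
    for _ in range(n_steps):
        ref_phase += 2 * math.pi * f_ref * dt
        # Phase detector: simplified multiplier-type detector output
        err = math.sin(ref_phase - vco_phase)
        # Averaging filter (LPF): single-pole low-pass on the detector output
        ctrl += lpf_alpha * (kp * err * f_ref - ctrl)
        # Oscillator: free-running frequency plus the control correction
        vco_phase += 2 * math.pi * (f_free + ctrl) * dt
    # Wrap the final phase error into [-pi, pi]
    diff = ref_phase - vco_phase
    return math.atan2(math.sin(diff), math.cos(diff))
```

With a reference near the oscillator's free-running frequency, the filtered error pulls the oscillator toward lock; the quality of that averaging step is exactly where a too-slow reference clock would start to matter.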
These PLL circuits have a minimum frequency requirement for the input clock, often in the several-MHz range. I am wondering how crucial it is to satisfy this requirement, and where it is derived from.
Question:
- Technically, what causes the minimum requirement?
- How far below the specified minimum clock frequency would the PLL still be expected to function, and what would the resulting error look like?