Jitter can be thought of as phase/frequency modulation of the signal. In the frequency domain, modulation creates sidebands around the carrier. Thus, if you observe a jittered signal on a spectrum analyzer, you'll see the carrier frequency plus the upper and lower sideband components produced by the jitter. It's important to remember that jitter occurs at different rates (frequencies). You can think of this as "how slowly or rapidly is the edge being moved away from its ideal position in time". Different jitter frequencies will produce frequency-domain sidebands at different offset frequencies (offsets from the carrier frequency), as the sketch below illustrates.
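Here's a minimal sketch of that idea, assuming NumPy. It phase-modulates a carrier with a single sinusoidal jitter tone (the frequencies, modulation depth, and sample rate are arbitrary illustration values) and shows that the three strongest spectral lines are the carrier and a sideband pair offset by the jitter rate:

```python
import numpy as np

fs = 1e6      # sample rate, Hz (arbitrary)
fc = 100e3    # carrier frequency, Hz
fm = 5e3      # jitter (modulation) rate, Hz
beta = 0.01   # peak phase deviation, radians (small-angle regime)

t = np.arange(0, 0.1, 1 / fs)   # 100 ms of signal
# Carrier with sinusoidal phase modulation -- i.e., sinusoidal jitter.
x = np.cos(2 * np.pi * fc * t + beta * np.sin(2 * np.pi * fm * t))

spectrum = np.abs(np.fft.rfft(x)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs)

# The three dominant lines: the carrier plus a sideband pair at fc +/- fm.
peaks = freqs[np.argsort(spectrum)[-3:]]
print(sorted(peaks))   # ~ [95000.0, 100000.0, 105000.0]
```

A slower jitter rate (smaller fm) pulls the sideband pair closer to the carrier; a faster rate pushes it farther out.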
Jitter is often spec'd in the time domain. Phase noise is spec'd as the spot magnitude of the sideband at a specific offset frequency. For example, you might see a spec like -90 dBc/Hz at 10 kHz offset. This says that the sideband level is 90 dB down from the carrier magnitude when measured 10 kHz away from the carrier frequency, normalized to a 1 Hz measurement bandwidth.
So, you can't directly compare a dBc/Hz level with a jitter spec. The only way to relate them is to take the complete phase noise characteristic in the frequency domain and integrate the total amount of power in the phase noise sidebands. That total integrated phase noise power can then be converted to an RMS jitter value.
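Here's a hedged sketch of that integration step, again assuming NumPy. The phase noise table is entirely made up for illustration; a real datasheet would give you L(f) points like these. It converts the SSB phase noise L(f) from dBc/Hz to a linear power ratio, integrates across the offset band (doubling to count both sidebands), and divides the resulting RMS phase by the carrier's angular frequency to get RMS jitter in seconds:

```python
import numpy as np

fc = 100e6   # carrier frequency, Hz (assumed for this example)

# Hypothetical SSB phase noise table: offset frequency vs. L(f) in dBc/Hz.
offsets = np.array([100, 1e3, 10e3, 100e3, 1e6])   # Hz
l_dbc   = np.array([-70, -85, -90, -110, -130])    # dBc/Hz

l_lin = 10 ** (l_dbc / 10)   # dBc/Hz -> linear power ratio, 1/Hz

# Trapezoidal integration of the sideband power; the factor of 2 counts
# both the upper and lower sidebands. (Simple linear interpolation between
# points -- real tools often interpolate on log-log axes instead.)
phase_power = 2 * np.sum((l_lin[1:] + l_lin[:-1]) / 2 * np.diff(offsets))

rms_phase = np.sqrt(phase_power)             # radians RMS
rms_jitter = rms_phase / (2 * np.pi * fc)    # seconds RMS

print(f"{rms_jitter * 1e12:.1f} ps RMS")
```

Note that the result depends on the integration limits you choose (here 100 Hz to 1 MHz), which is why jitter numbers derived from phase noise always come with a stated offset band.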