
In many datasheets clock tolerance is given in ppm, while in others it is given in ns or ps. What is the difference between specifying clock tolerance in ppm and in ns/ps? How can it be converted from one unit to another?

4 Answers


How it can be converted from one unit to another?

You can't. The two specs are about completely different things.

When the frequency tolerance is, say, 100 ppm, then your 1 MHz clock will have a frequency in the range 1 000 000 Hz ± 100 Hz.
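That arithmetic can be sketched in a few lines of Python (a minimal illustration; `freq_tolerance_hz` is a made-up helper name, not any library API):

```python
# Minimal sketch: the +/- frequency deviation implied by a ppm tolerance.
# freq_tolerance_hz is a hypothetical helper for illustration only.
def freq_tolerance_hz(nominal_hz: float, ppm: float) -> float:
    """Return the +/- deviation in Hz for a given ppm spec."""
    return nominal_hz * ppm / 1e6

dev = freq_tolerance_hz(1_000_000, 100)  # 1 MHz clock, 100 ppm spec
print(f"1 MHz clock, 100 ppm -> +/- {dev:.0f} Hz")  # +/- 100 Hz
```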

This says nothing about jitter. Frequency is only the average number of clock cycles in a second. Jitter is about the variance of the clock period. So, if your clock period varies randomly by Tj=10ns RMS (for example) but the period is still 1µs on average, then you can have a very accurate (but jittery) 1MHz clock.

EDIT: ...however if the accuracy is given in µs/minute (for example) then it is accuracy and not jitter. The way to convert it is quite simple...

\$ \text{ppm} = 10^6 \, \frac{\text{drift}}{\text{interval}} \$

so 1 µs drift over 1 minute = \$ 1µs/60s \$ ≈ 0.017 ppm (that's gotta be expensive...). It doesn't depend on the frequency, since it is a drift measured in time over a time interval. If you multiply both by the frequency, it becomes a drift measured in number of periods, over an interval which is also measured in number of periods. The frequency appears in both numerator and denominator, so it cancels.
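As a quick check of the formula, a sketch (the helper name `drift_ppm` is my own, chosen for illustration):

```python
# Sketch of the drift-to-ppm conversion: ppm = 1e6 * drift / interval.
# Note the clock frequency does not appear anywhere -- it cancels out.
def drift_ppm(drift_s: float, interval_s: float) -> float:
    return 1e6 * drift_s / interval_s

# 1 us of drift over 1 minute, as in the example above:
print(drift_ppm(1e-6, 60))  # ~0.0167 ppm, regardless of clock frequency
```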

Now about jitter, since I don't know whether you're talking about a jitter spec or a drift spec... The ppm spec is useful when you are interested in frequency accuracy. A jitter spec (or, more accurately, phase noise) is about spectral purity, which is very important if you use the frequency as an RF carrier; it also has a strong influence on ADC/DAC noise floors, etc. For example, in data transmission (like USB) frequency accuracy is not important, since the receiver synchronizes itself to the transmitter by using a PLL, but jitter in the recovered clock is extremely important, as you want to sample the received bits at the proper time.

bobflux

Devil's in the details.

On the face of it, you can't. But if you know the nominal (mean) value, you can. ppm is simply "Parts Per Million", which works exactly like a percentage.

One part per million (1 ppm) means 1/1000000th, or 1% is 10000 ppm.

If the time-base is one millisecond (1 kHz) and the jitter is one ppm, the jitter will be one millionth of 1 ms, or 1 ns.
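That calculation, sketched in code (assuming, as the example does, that the ppm figure is taken as a fraction of the period; `ppm_of_period` is an illustrative name):

```python
# Sketch: a ppm figure applied to a time base, per the 1 ms / 1 ppm example.
def ppm_of_period(period_s: float, ppm: float) -> float:
    """One millionth (per ppm) of the given period, in seconds."""
    return period_s * ppm / 1e6

print(ppm_of_period(1e-3, 1))  # about 1e-9 s, i.e. 1 ns
```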

Asmyldof

ppm indicates the accuracy, that is, the deviation from the nominal value. A spec in ns/ps, on the other hand, indicates how many seconds of error accumulate per day/month/year...
So you can compare the two by computing the worst case in terms of ppm. A 20 MHz clock with 20 ppm will run between 20M(1-20/1000000) Hz and 20M(1+20/1000000) Hz, and it will accumulate an error (worst case) of 86400*(20/1000000) = 1.728 seconds per day.
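The worst-case figures in that answer can be reproduced with a short sketch (helper names are mine, for illustration):

```python
# Sketch: 20 MHz clock with a 20 ppm tolerance (figures from the answer above).
SECONDS_PER_DAY = 86_400

def freq_range_hz(nominal_hz: float, ppm: float):
    """Return the (min, max) frequency implied by a ppm tolerance."""
    d = nominal_hz * ppm / 1e6
    return nominal_hz - d, nominal_hz + d

def worst_case_error_per_day_s(ppm: float) -> float:
    """Worst-case accumulated time error over one day."""
    return SECONDS_PER_DAY * ppm / 1e6

lo, hi = freq_range_hz(20e6, 20)
print(lo, hi)                          # 19999600.0 20000400.0
print(worst_case_error_per_day_s(20))  # 1.728 seconds per day
```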

Uwe
Jose

Time Jitter = Vnoise / SlewRate

If you need more details, just whistle (from Casablanca)

analogsystemsrf