
I am trying to produce a ~30 μs delay on a 5 V pulse with less than 1 ns jitter. What would be the best way to accomplish this (if any)?

My basic requirements:

  • Delay must be known to the ns
  • Low temperature drift (preferably less than 1 ns/1°C, but at least known to the ns/°C, or some way to keep the temperature constant)
  • Very low jitter (<1 ns)
  • Logic high 5 V
  • Functional at frequencies up to 10 kHz
  • The exact delay time isn't critical, just the jitter? Is very low frequency drift with temperature etc acceptable? – pipe Sep 22 '22 at 22:30
  • The best way is undefined. You likely have some limits in which the task must be done. You likely get answers which you say you can't accept because of your unknown limitations. – Justme Sep 22 '22 at 22:31
  • @pipe sorry, I should have listed my requirements. I have updated with what I need. Thank you. – Karsten Schnier Sep 22 '22 at 22:38
  • what circuitry is the 5V pulse coming from? does it have its own clock? – Miron Sep 22 '22 at 22:53
  • @Miron it's coming from a laser driver internal photodiode. It is a one time pulse and there is no externally accessible clock. – Karsten Schnier Sep 22 '22 at 22:58
  • Use a >1Gsps DSO with a trigger output, set the trigger to occur on the rising edge of your pulse with a 30us delay. I haven't done this, but it sounds feasible - comments? – Tesla23 Sep 23 '22 at 00:17
  • When I did optical/laser timing work, we would buy digital delay generators from SRS. They're not very expensive as far as test equipment goes, especially second hand. – user1850479 Sep 23 '22 at 00:42
  • @KarstenSchnier Perhaps study [this](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8749538/)? – jonk Sep 23 '22 at 03:35
  • Maybe some kind of programmable logic. – user57037 Sep 23 '22 at 04:31
  • I am pretty sure I saw a detailed schematic for a pulse generator a long time ago. Maybe a Stanford Research model. But I don't find it now. It was something like using a capacitor ramp rate to store the time prior to a trigger, then adding the reciprocal of it on to the end. In this way a digital pulse could be fine-tuned with a small extra bit of time. It struck me that getting it to work well would be challenging. – user57037 Sep 23 '22 at 05:34
  • Can you use a long length of optical fibre? – D Duck Sep 23 '22 at 08:44
  • becker and hickl. They do Digital Delay Generators. But not to the spec required. https://www.becker-hickl.com/products/ddg-210-digital-pulse-generator/ – D Duck Sep 23 '22 at 08:50
  • @user1850479 I understand that SRS has a policy of not supporting their products except for the original owner. Do you know if that is true? – Spehro Pefhany Sep 23 '22 at 14:37
  • You might look at https://electronics.stackexchange.com/questions/220559/counter-for-20-ghz-clock – user69795 Sep 23 '22 at 15:02

8 Answers


You mentioned that you're using a laser and photodiode. Fiber optic cables will have essentially zero propagation delay jitter, so an optical delay line is a very good solution. A general rule of thumb for typical networking-grade optical fibers is a propagation delay of 5ns/m, or 200m/µs. Based on this, for 30µs you'd need around 6000m of optical fiber.

You could buy a 10km reel and get it cut and terminated, but you can just buy singlemode fiber in 5km and 1km spools, then couple them together. With a decent coupler you should see very little additional insertion loss. The reels are not physically large.

You can buy these fiber reels for far less than the cost of the test equipment you would typically need to buy to achieve the jitter spec you need. If you went with cheaper brands you could probably pull this off for just a few hundred USD, although you might need higher grade fiber if you need to minimise losses due to the source laser not being particularly powerful. Even then, I wouldn't expect the cost of fiber to exceed 1000USD.

If the delay needs to be fairly exact, rather than just roughly 30µs, you might want to buy a 10km reel, measure the total propagation delay, then calculate the required fiber length and get it cut and terminated to match your needs. Given that 1cm of fiber represents approximately 50 picoseconds of delay, if you give yourself some margin and take a few passes at cutting and re-terminating the fiber, you should be able to dial in that 30µs delay to within a couple hundred picoseconds by hand.
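As a sanity check on the arithmetic above, here is a quick sketch (assuming the rule-of-thumb 5 ns/m figure; the real group delay depends on the fiber's group index, so measure before cutting):

```python
# Rule-of-thumb fiber delay arithmetic: ~5 ns/m propagation delay.
# Verify against the actual fiber's datasheet before cutting anything.
NS_PER_M = 5.0

def fiber_length_m(delay_us):
    """Metres of fiber needed for a delay given in microseconds."""
    return delay_us * 1000.0 / NS_PER_M

def trim_delta_ps(trim_cm):
    """Delay change, in picoseconds, from trimming trim_cm of fiber
    (5 ns/m is 50 ps/cm)."""
    return trim_cm * NS_PER_M * 10.0

print(fiber_length_m(30))   # 6000.0 m for a 30 us delay
print(trim_delta_ps(1))     # 50.0 ps per centimetre trimmed
```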

You can also buy off-the-shelf fiber delay lines for this exact purpose. They're used by networking equipment manufacturers to test long distance fiber runs in the lab. Far more expensive than just buying a reel and DIY'ing it, but they do come in a nice rackmount chassis and you might be able to find something inexpensive second hand.

If your laser isn't powerful enough to propagate through this much fiber, you could look at hooking up your drive pulse to a network transceiver module. The interface is differential and fairly simple to use. On paper, the worst-case delay jitter in such a setup is the maximum transmitter jitter (typically 0.35UI or less) plus the maximum receiver jitter (again typically 0.35UI or less); see the note below. For a 10GBASE-LR transceiver (capable of 10Gbps over 10km) a unit interval is 100ps, so based on the specs you would expect to see a worst case of around 70ps of jitter. However, I've never actually tried to measure the real-world jitter in a transmitter and receiver pair, so you'd have to try this out yourself. Luckily for you, 10GBASE-LR transceivers are ubiquitous and cheap on the second-hand market, so it's not an expensive experiment to try.

Note: I got the jitter wrong initially (thanks to @dotwaffle on Twitter for pointing this out). 10G modules are asynchronous and the SERDES is on the linecard, not the module, so the total jitter is simply the sum of the two modules' jitter numbers. It's also worth noting that clock and data recovery and dispersion compensation are also on the linecard in SFP+, which might complicate things a little; you'll need to experiment with the modules to see if you can use them as a transparent interface.
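As a quick sanity check on the jitter numbers (the 0.35 UI per-module figure is a typical spec value, as noted above):

```python
# Worst-case jitter estimate for a 10GBASE-LR transceiver pair.
# Per the note above, the total is simply the sum of the two modules'
# contributions; 0.35 UI per module is a typical spec figure.
UI_PS = 100.0           # one unit interval at 10 Gbps, in picoseconds
TX_JITTER_UI = 0.35     # assumed transmitter spec
RX_JITTER_UI = 0.35     # assumed receiver spec

worst_case_ps = (TX_JITTER_UI + RX_JITTER_UI) * UI_PS
print(worst_case_ps)    # about 70 ps worst case
```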

Polynomial

A long time ago I worked on a core timing module for various RF stuff. We fed a low jitter 125 MHz clock to an FPGA. The FPGA turned on various RF switches (at just the right time) and controlled two Analog Devices DDSs. I am not sure the timing outputs had less than 1 ns jitter, but I suspect they did.

I think you could feed a fast, low-jitter clock to a programmable logic device of some sort. Make it do whatever it is you want to do. Registering the output to your low-jitter clock may be a key part of the puzzle to minimize jitter.
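As a rough sketch of the timing budget for that approach (the 125 MHz clock is just an example figure; this is arithmetic only, not HDL):

```python
# Timing budget for an FPGA counter delay, assuming a hypothetical
# 125 MHz low-jitter clock (8 ns period).
CLOCK_HZ = 125e6
PERIOD_NS = 1e9 / CLOCK_HZ          # 8 ns per clock cycle

target_delay_us = 30.0
cycles = round(target_delay_us * 1e3 / PERIOD_NS)   # counter terminal count

# An asynchronous trigger lands anywhere within a clock period, so the
# resynchronised delay varies by up to one period unless the input is
# also measured with a finer time-to-digital stage.
sync_uncertainty_ns = PERIOD_NS

print(cycles)                # 3750 cycles for 30 us
print(sync_uncertainty_ns)  # 8.0 ns trigger-to-clock uncertainty
```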

user57037

How low a jitter do you need? I would suggest a small 8-pin microcontroller like an ATtiny85 driven by a crystal oscillator. Set an interrupt on the pin connected to the trigger signal. In the crystal's datasheet you will find the accuracy of the crystal, usually in PPM.

If you need something more accurate, oven-controlled crystal oscillators (OCXOs) are used to get rid of temperature variations.

If you need something less accurate, the ATtiny85 will also work off its internal RC oscillator.

Miron
  • Unless you're running the ATTiny faster than 1 GHz this will have a lot more than 1 ns jitter. The oscillator doesn't matter, because it will be asynchronous with the trigger. – pipe Sep 22 '22 at 22:42
  • yeah, good point – Miron Sep 22 '22 at 22:45
  • Some MCU from STM have high resolution timers with 217 ps resolution so this idea might work, for example: https://www.st.com/en/microcontrollers-microprocessors/stm32f334r8.html – Rokta Sep 23 '22 at 06:58

There are certainly complete instruments that will do this, and although they're not going to be remotely cheap it might be a solution if you have funding. I'd look at Highland Technology and Berkeley Nucleonics for starters.

Otherwise, maybe an MCU with a relatively high frequency timer clock (say 250MHz, so probably 500MHz internal clock) and use tapped delay lines to split the 4ns resolution into pieces on input and output. For example, on the input trigger you read n delayed bits from a tapped delay line to determine how late the trigger was relative to the input signal and then shorten the output delay by a similar amount of time (doing some calculation and setup during the ample ~30 microsecond pulse length).
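A rough numeric sketch of that coarse/fine split (all figures hypothetical: the 4 ns timer tick and 0.5 ns tap pitch are assumptions, not measured values):

```python
# Sketch of the coarse/fine correction: measure how late the trigger
# arrived relative to the timer clock via a tapped delay line, then
# shorten the coarse delay to compensate. A 250 MHz timer (4 ns tick)
# and a 0.5 ns tap pitch are assumed, purely for illustration.
TICK_NS = 4.0
TAP_NS = 0.5

def corrected_delay(target_ns, late_taps):
    """Return (coarse ticks, residual ns left for the output-side taps)."""
    late_ns = late_taps * TAP_NS                 # measured input lateness
    coarse = int((target_ns - late_ns) // TICK_NS)
    residual_ns = target_ns - late_ns - coarse * TICK_NS
    return coarse, residual_ns

coarse, residual = corrected_delay(30_000.0, late_taps=5)
print(coarse, residual)   # 7499 coarse ticks, 1.5 ns for the output taps
```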

The Vdd would have to be well regulated and decoupled and the MCU capable of configuration for very fast output transitions to minimize jitter. And of course the firmware would have to be 'bare metal'. And you'd certainly want a way to test the results to specification.

Spehro Pefhany

If you have an oscillator (or MCU with hardware interrupts) -- say 20 MHz, you can generate the bulk of the 30 us with XTAL accuracy. Note that the delay between your (asynchronous) input and the start of your clock will be 0 to 1 oscillator cycle. Similarly a delay needs to be added between the last (counted) oscillator cycle and your final output.

With 20 MHz, TOSC = 1/f = 50 ns.

If you then build a circuit that measures the delay between your input signal and the next oscillator edge (it only has to be accurate to 1 ns / 50 ns = 2 %), you need to add 50 ns minus this delay to the end.

You can measure the first delay with an RC circuit and a S/H. You can generate the second delay with simple RC timer circuits and a logic gate, but this will need some calibration (for logic gate delay etc.).

In the end, you will have 3 elements:

  1. the first fraction of a cycle delay = DELAY1
  2. 'N' counts of the oscillator (just less than 30 us)
  3. TOSC − DELAY1 at the end.

This will give N+1 oscillator cycles, synchronized to your input. Obviously a higher oscillator frequency will correspondingly simplify the analog RC portions.
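A minimal numeric sketch of that decomposition (the 17 ns DELAY1 value is an arbitrary example standing in for the S/H measurement):

```python
# Numeric sketch of the three-element delay: DELAY1 (input to next
# oscillator edge), N counted cycles, then TOSC - DELAY1 at the end.
# 20 MHz oscillator assumed, so TOSC = 50 ns.
TOSC_NS = 50.0

def decompose(delay1_ns, target_us=30.0):
    """Return (N, trailing delay, total) with total = (N + 1) cycles."""
    n = int(target_us * 1e3 / TOSC_NS) - 1   # counted cycles, just under 30 us
    tail_ns = TOSC_NS - delay1_ns            # trailing analog delay
    total_ns = delay1_ns + n * TOSC_NS + tail_ns
    return n, tail_ns, total_ns

n, tail, total = decompose(delay1_ns=17.0)
print(n, tail, total)   # 599 counted cycles; total is exactly 30000 ns
```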

Otherwise the way to do this is to use a DLL to subdivide the oscillator (the DLL might have 50 delay stages tuned to give 50 ns total delay), and digitally select the correct stage to generate signals. This would be quite difficult to do with discrete logic ICs, but an FPGA might be able to do it with care.

jp314
  • I looked at the jitter on an ADSP-21xx DSP running on a very high quality TCXO giving it an 80 ns cycle. The clock-to-clock jitter at one of its output pins (that tracked the input clock) was a lot more than 1 ns. In no possible way would one be able to get near 1 ns jitter using that system. I feel the OP is asking for a great deal. – jonk Sep 23 '22 at 02:18

Using a simple function generator to create 3 µs pulses at about 8 kHz, I just set up a Siglent DSO running at 2 Gsps to generate a trigger 30 µs after an edge (Menu: Trigger/Type/Dropout). Using another scope, I can see the trigger output giving me a pulse 30 µs after the edge. As this is a trigger output, it should be stable to the sample rate (2 Gsps, so 0.5 ns), but I haven't checked it. The trigger level and delay are simply set from the front panel. The pulse width you can detect depends on the bandwidth of the scope.

I haven't checked the jitter.

So it's possible that the machine you are using to see the pulse can actually generate the delayed signal.

Note that I'm assuming that the delay is created by counting samples, I would expect this to be true, but...

Tesla23

An ASIC can probably produce a pulse with less than 1 ns jitter. Many MCUs can produce software-controlled delays as long as 30 μs, but the jitter of the generated signals is somewhat unpredictable, as interrupts in the MCU can cause jitter. You might also have clock jitter: since almost all signals are generated from an internal clock, there would be some jitter in the generated waveforms. A complete hardware solution with an ASIC or FPGA might be the best option.

Amit M

I would use an FPGA, clocked from a fairly high frequency like 250 MHz, with an ADC and DAC clocked at the same rate.

The pulse would enter a Bessel anti-alias filter which would give it a well-defined rise time, and allow the ADC plus some DSP to interpolate the input pulse position to sub-nanosecond.

A certain number of clock cycles later, the DAC would output a filtered transition, which would allow an anti-alias filter plus comparator to produce a square output pulse with sub-nanosecond precision. Here is a patent that describes how to do the DAC output step.
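A toy sketch of the interpolation step (synthetic samples and linear interpolation only; a real implementation would fit the filter's known step response):

```python
# Estimate the sub-sample crossing time of a band-limited edge from
# ADC samples, assuming a 250 MHz sample clock (4 ns period) and a
# threshold at half amplitude. Because the anti-alias filter spreads
# the rise over several samples, interpolating between the two samples
# that straddle the threshold resolves the edge to sub-nanosecond.
SAMPLE_NS = 4.0

def edge_time_ns(samples, threshold):
    """Interpolated threshold-crossing time, in ns from sample 0."""
    for i in range(1, len(samples)):
        a, b = samples[i - 1], samples[i]
        if a < threshold <= b:
            frac = (threshold - a) / (b - a)   # linear interpolation
            return (i - 1 + frac) * SAMPLE_NS
    raise ValueError("no crossing found")

# Synthetic filtered edge: flat, then a ramp spanning several samples.
samples = [0.0, 0.0, 0.1, 0.3, 0.5, 0.7, 0.9, 1.0]
print(edge_time_ns(samples, threshold=0.5))
```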

Neil_UK