
I'm making a lidar module as a project at my university. The deliverable should be a module like the YDLIDAR X4 or the TF-Mini lidar.

I watched a video from GreatScott in which he disassembled a YDLIDAR X4 lidar. The microcontroller inside was an STM32F302x6. I went through its datasheet and found that its maximum clock is 72 MHz.

Through a simple calculation: light travels at 3*10^8 m/s, so to time the flight over 0.5 m to the object (a 1 m round trip, about 3.3 ns) the clock would need to be at least 300 MHz. Yet the X4 can measure down to 30 cm. How is that possible, or what did I get wrong? And what minimum microcontroller clock do I need so I can at least match these specifications?
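Here's the rough arithmetic behind that claim, assuming (perhaps naively) that the MCU itself counts its own clock cycles to time the pulse:

```c
/* Back-of-the-envelope check: what one tick of a 72 MHz MCU clock would
 * mean for ranging if the MCU itself counted cycles to time the pulse. */
#include <stdio.h>

int main(void)
{
    const double c     = 3.0e8;   /* speed of light, m/s     */
    const double f_clk = 72.0e6;  /* STM32F302 max clock, Hz */

    double tick          = 1.0 / f_clk;      /* one clock period: ~13.9 ns */
    double dist_per_tick = c * tick / 2.0;   /* one-way distance per tick  */

    printf("Round trip for a 0.5 m target: %.2f ns\n", 2.0 * 0.5 / c * 1e9);
    printf("Range resolution per 72 MHz tick: %.2f m\n", dist_per_tick);
    /* Prints ~3.33 ns and ~2.08 m -- far coarser than 30 cm. */
    return 0;
}
```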

Edit 1: It seems that the microcontroller doesn't do the time-of-flight measurement itself; a specialized IC does the timing and then interfaces with the microcontroller.

In addition, I didn't want to use an FPGA because I know nothing about them, and learning one would take too much time for a project like this.

AhmedH2O
  • Can you elaborate on why you've chosen to use a microcontroller *alone* (and predicated the design on its clock) without considering high-speed circuits such as ToF signal processing ICs, FPGAs, or similar? (Please do so by [edit]ing your post, since responses in comments can get lost) – nanofarad Mar 25 '22 at 14:34
  • You can't treat LIDAR like SONAR. The MCU wouldn't handle the timing. A LIDAR has dedicated timing circuitry. If an MCU could handle the timing you would see a LOT more LIDARs. If this is your capstone you may want to reconsider the project. – DKNguyen Mar 25 '22 at 14:41
  • Also, read your own device description. The X4 uses triangulation, not time of flight. That you could do for a capstone. – DKNguyen Mar 25 '22 at 14:43
  • It's probably only relevant to discuss the details of the specific sensor used. This is kind of the same thing as "how can you have a serial bus baudrate of 8 MHz when you clock the MCU with 4 MHz". Because... unrelated hardware. – Lundin Mar 25 '22 at 14:45
  • I once built a lidar system with a resolution of 1 micron (optical bandwidth: ~300 THz). I had a Pentium 4, which in those days I think ran at about 3 GHz, so we had 1 CPU cycle per 100,000 optical cycles. Fortunately, an interferometer does not care how fast the CPU that reads out the interference fringe is, and the low CPU speed mainly limited our update rate. – user1850479 Mar 25 '22 at 15:36

1 Answer


The microcontroller is not directly used to time the laser transmissions and returns, but rather is used for digital processing of data obtained from faster dedicated circuitry.

The key building block of time-of-flight detection is a so-called "time-to-digital converter" (TDC); an example is TI's TDC7200. A TDC can measure times far more precisely than your microcontroller ever could (whether in pure software or with one of its timer peripherals); in fact, the example TDC I linked offers picosecond-level precision when properly powered, configured, and integrated into a design. To achieve those specs with such a chip, you must provide a precise reference clock and clean power supplies, and follow good PCB layout practices.
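To put "picosecond-level" in perspective: 1 cm of range corresponds to only about 67 ps of round-trip time, while one tick of a 72 MHz MCU clock is roughly 14,000 ps. A quick back-of-the-envelope calculation (plain arithmetic, nothing specific to the TDC7200):

```c
/* Why picosecond timing matters for centimetre-level ranging. */
#include <stdio.h>

int main(void)
{
    const double c = 3.0e8;                            /* speed of light, m/s */

    double ps_per_cm   = (2.0 * 0.01 / c) * 1e12;      /* round trip for 1 cm */
    double ps_per_tick = 1e12 / 72.0e6;                /* one 72 MHz period   */

    printf("round-trip time per cm of range: %.1f ps\n", ps_per_cm);   /* ~66.7  */
    printf("one 72 MHz clock tick:          %.0f ps\n", ps_per_tick);  /* ~13889 */
    return 0;
}
```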

With the proper set of TDCs and laser drivers connected to the optics of your LIDAR, your microcontroller can initiate pulses, digitally read back the return times, and perform any digital signal processing required. This TI application note describes an example application of TDCs and other parts to LIDAR, and is far more useful and precise than a teardown video on YouTube.
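As a rough sketch of that division of labor, the MCU-side code might look like the following. All three "driver" functions are hypothetical placeholders (stubbed here with a fixed fake reading), not the API of the TDC7200 or any other real part:

```c
/* Sketch: the MCU only orchestrates and post-processes; the TDC and laser
 * driver handle the fast timing. Driver functions are hypothetical stubs. */
#include <stdint.h>
#include <stdio.h>

static void     tdc_arm(void)          { /* would arm the TDC's start input */ }
static void     laser_fire_pulse(void) { /* would trigger the laser driver  */ }
static uint32_t tdc_read_tof_ps(void)  { return 6667; } /* fake ~1 m reading */

#define C_M_PER_S 3.0e8
#define N_AVG     16               /* average several pulses to reduce jitter */

static double measure_distance_m(void)
{
    uint64_t sum_ps = 0;

    for (int i = 0; i < N_AVG; i++) {
        tdc_arm();
        laser_fire_pulse();
        sum_ps += tdc_read_tof_ps();   /* the TDC timed the round trip, not the MCU */
    }

    double tof_s = ((double)sum_ps / N_AVG) * 1e-12;
    return C_M_PER_S * tof_s / 2.0;    /* halve for the out-and-back path */
}

int main(void)
{
    printf("%.3f m\n", measure_distance_m());  /* ~1.000 m with the fake reading */
    return 0;
}
```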

Note that time-of-flight is not the only approach; it's also possible to modulate the outgoing beam and mix it with the incoming reflection to detect the reflection's phase shift at one or more modulation frequencies, and there are approaches based on optical interferometry, as a comment points out.
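For illustration, here is how the phase-shift (AMCW) variant recovers distance; the modulation frequency and measured phase below are made-up example values, not taken from any particular part:

```c
/* Phase-shift ranging: d = c * phi / (4 * pi * f_mod),
 * unambiguous out to c / (2 * f_mod). Example numbers only. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double c     = 3.0e8;    /* speed of light, m/s              */
    const double f_mod = 10.0e6;   /* example modulation frequency, Hz */
    double phase_rad   = 0.8378;   /* example measured phase shift     */

    double distance_m  = c * phase_rad / (4.0 * M_PI * f_mod);
    double max_range_m = c / (2.0 * f_mod);

    printf("distance: %.2f m (unambiguous up to %.1f m)\n", distance_m, max_range_m);
    return 0;
}
```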

nanofarad