
One way to design a controller for a digital control system is to design it in the s domain first and then convert it to the z domain. However, doing this requires choosing a suitable sampling period.

In most problems I've faced up to this point, I first find the smallest time constant c of my system, then choose the sampling period T so that 0.1c < T < 0.5c.
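As a concrete sketch of that rule (the time constant value and the 0.2c pick are just illustrative):

```c
#include <stdio.h>

int main(void)
{
    double c = 0.02;    /* smallest time constant in seconds (example value) */
    double T = 0.2 * c; /* any T with 0.1c < T < 0.5c satisfies the rule */

    printf("T = %g s  (sample rate %.0f Hz)\n", T, 1.0 / T);
    return 0;
}
```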

I just came across a problem in which the transfer function has two poles at the origin and I can't find a time constant. How should I choose my sampling period?

John Katsantas

1 Answer


Usually I run the A/D much faster than needed, then low-pass filter that stream of samples. That reduces the random and quantization noise. It also allows for a simple analog anti-aliasing filter. The analog filter only needs to squash frequencies above half the sample rate, not half of the decimated rate you may use later.
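A minimal sketch of that scheme, assuming a hypothetical adc_read() and a downstream control_update() running at the decimated rate (the decimation ratio of 16 and the filter fraction of 1/8 are just example choices):

```c
#include <stdint.h>

#define DECIMATE 16  /* oversampling ratio (assumed) */

extern uint16_t adc_read(void);                 /* hypothetical fast A/D read */
extern void     control_update(int32_t sample); /* hypothetical slow-rate consumer */

static int32_t filt;  /* filter state, kept scaled by 256 for extra precision */

void adc_sample_isr(void)  /* called at the fast A/D rate */
{
    static unsigned n;

    /* single-pole IIR low pass: filt += (new - filt) / 8 */
    filt += (((int32_t)adc_read() << 8) - filt) >> 3;

    if (++n >= DECIMATE) {  /* pass every 16th filtered value downstream */
        n = 0;
        control_update(filt >> 8);  /* undo the scaling */
    }
}
```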

Then, unless you're doing signal processing, it's usually more useful to decide the low pass filter response by looking at the step response in the time domain. Usually you know how much of a delay you can tolerate from when the system does something until you react to it in the micro. I usually use multiple poles of a simple IIR filter. I go into more detail here.
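A sketch of cascading several poles of the simple one-pole IIR filter filt ← filt + ff(new − filt), with the pole count and filter fraction as assumed example values:

```c
#include <stdint.h>

#define NPOLES     3   /* number of cascaded filter poles (assumed) */
#define FILT_SHIFT 4   /* filter fraction ff = 1/16 per pole (assumed) */

static int32_t pole[NPOLES];  /* state of each cascaded section */

int32_t filter_sample(int32_t in)
{
    int32_t x = in;

    for (int i = 0; i < NPOLES; i++) {
        /* one section: pole += (x - pole) / 2^FILT_SHIFT */
        pole[i] += (x - pole[i]) >> FILT_SHIFT;
        x = pole[i];  /* output of this pole feeds the next */
    }
    return x;
}
```

Each extra pole steepens the rolloff; watching the cascade's step response tells you whether the total delay stays within what the loop can tolerate.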

Olin Lathrop