For reference, I'm using an AVR ATmega328P, and the specs for the servo I'll be using say:
- Pulse Cycle: 20 ms (I assume this means the "period")
- Pulse Width: 600-2400 µs
So from my understanding, using fast PWM and, let's say, a 1 MHz clock, I make the width by setting two values: one at TOP minus whatever width I want (let's pretend 1800 µs; I think people use ICR1 for this), and then TOP itself (I think they usually use OCR1A here).
So the time between ICR1 and OCR1A (which is TOP) is where I turn the pin to the servo on, correct? And then it turns off when the counter reaches OCR1A.
I.e.: turn on when the counter reaches ICR1, turn off when it reaches OCR1A. If I want a width of 1.8 ms, I'd need to make that gap 1800 steps, correct? Since 1800 ticks would be 1.8 ms (if I'm understanding my units correctly).
The pulse cycle is where I get confused. That would just be the TOP value, right? I.e.: if I need a 20 ms pulse cycle, and every tick at 1 MHz is 1/1000 of a millisecond, then I'll need 20,000 ticks for the 20 ms cycle.
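Writing the tick math out the way I understand it (the macro name is just mine for illustration, and I'm assuming a prescaler of 1 so the timer runs at the full clock rate):

```c
#define F_CPU 1000000UL                               /* 1 MHz clock             */
#define US_TO_TICKS(us) ((us) * (F_CPU / 1000000UL))  /* 1 tick = 1 µs at 1 MHz  */

/* US_TO_TICKS(1800)  -> 1800  ticks = 1.8 ms pulse width */
/* US_TO_TICKS(20000) -> 20000 ticks = 20 ms  pulse cycle */
```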
Am I thinking about this correctly? Relating the ticks to the CPU frequency, and then working that out against cycles and widths, is just sort of throwing me for a loop.
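Just to make it concrete, here's the rough Timer1 setup I've pieced together from examples (untested, and I may well have the ICR1/OCR1A roles backwards, which is part of what I'm asking):

```c
#include <avr/io.h>

int main(void)
{
    DDRB |= (1 << DDB1);                    /* OC1A / PB1 as the servo output pin */

    /* Fast PWM, mode 14: TOP = ICR1, non-inverting output on OC1A */
    TCCR1A = (1 << COM1A1) | (1 << WGM11);
    TCCR1B = (1 << WGM13) | (1 << WGM12) | (1 << CS10);   /* prescaler = 1 */

    ICR1  = 20000 - 1;   /* 20,000 ticks = 20 ms pulse cycle at 1 MHz */
    OCR1A = 1800;        /* ~1800 ticks high = ~1.8 ms pulse width    */

    for (;;)
    {
        /* PWM runs in hardware, nothing to do here */
    }
}
```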