
I want to increase the effective resolution of the ADC by oversampling and decimation. Unfortunately the signal I'm reading is too clean, so I would like to add a bit of artificial noise (1 LSB peak to peak) to the signal.
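
For reference, the decimation step I have in mind is the usual sum-and-shift approach: average 4^n samples and right-shift by n to gain n bits. A minimal sketch in C, assuming a 10-bit ADC and a hypothetical blocking `read_adc()` helper standing in for the real driver call:

```c
#include <stdint.h>

extern uint16_t read_adc(void);   /* hypothetical: returns one raw 10-bit sample */

uint16_t read_adc_12bit(void)
{
    uint32_t sum = 0;

    /* 4^n samples give n extra bits: 4^2 = 16 reads for 2 bits */
    for (uint8_t i = 0; i < 16; i++)
        sum += read_adc();

    /* decimate: shift right by n = 2, leaving a 12-bit result */
    return (uint16_t)(sum >> 2);
}
```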

I would like to use an MCU timer to output a square wave, convert it to a triangle-like wave, and add it to the signal.
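
On the firmware side, the timer setup I have in mind is roughly the following sketch. It assumes an 8-bit AVR (e.g. ATmega328) toggling OC0A in CTC mode; the prescaler and compare value are placeholders, to be chosen so the square wave sits well above the signal band:

```c
#include <avr/io.h>

/* f_out = F_CPU / (2 * prescaler * (OCR0A + 1)) */
static void dither_pwm_init(void)
{
    DDRD   |= _BV(PD6);                 /* OC0A (PD6) as output       */
    TCCR0A  = _BV(COM0A0) | _BV(WGM01); /* toggle OC0A, CTC mode      */
    TCCR0B  = _BV(CS01);                /* clk/8 prescaler            */
    OCR0A   = 24;                       /* 16 MHz/(2*8*25) = 40 kHz   */
}
```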

Below is my attempt, but it does not work as expected - the amount of noise added varies with the level of the analog signal.

Can someone enlighten me on this topic? How do I do this properly?

(schematic created using CircuitLab; image not reproduced here)

miceuz
  • Just tying the PWM pin to ground through a capacitor didn't add enough noise? – Ignacio Vazquez-Abrams Apr 26 '14 at 18:25
  • " the amount of noise added varies with the level of the analog signal." Is it possible you are running into a floor effect? What's the relationship between the A/D range and the range of V2? – gwideman Apr 26 '14 at 18:59
  • 1
    @gwideman It was due to unbuffered analog signal, I've used a pot when breadboarding the circuit, when I've added an opamp buffer, everythig started to work. – miceuz Apr 26 '14 at 20:07

1 Answer


It won't vary with the input signal level the way you've shown it; however, it will vary with the impedance of the source.

I suggest adding the noise with an op-amp, to isolate the input from the noise source. You should probably have an anti-alias filter on the input signal before the summing point (unless it's naturally band-limited), and make sure that the input is not correlated with the triangle wave.
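
For example, with an inverting summer the two sources add independently of each other's impedance. A sketch of the relation, assuming hypothetical input resistors \(R_1\) (signal) and \(R_2\) (dither) and feedback resistor \(R_f\):

$$V_{out} = -R_f \left( \frac{V_{sig}}{R_1} + \frac{V_{dither}}{R_2} \right)$$

One way to set the dither amplitude is to make \(R_2\) much larger than \(R_1\), so the triangle wave is attenuated to about 1 LSB at the summing output.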

Spehro Pefhany