Question:
Imagine a transmitter and transmit antenna with the following characteristics:
- 433 MHz transmitter, transmitting at 10 dBm
- Into an impedance-matched (50 ohm) antenna, omni-directional, with 3 dBi gain
And a receiver:
- 50 ohm impedance antenna, receiver circuitry presents 50 ohms
- Receiving antenna is an identical 3 dBi omni-directional antenna.
How would I go about calculating the voltage the receiver circuitry would see? I'm trying to learn RF electronics by building a detector for a constant 433.92 MHz carrier, and I can't predict how my diode will behave without knowing the voltage applied across it.
My attempt from first principles:
10 dBm is 10 mW of power, and 3 dBi of gain means 10 mW × 2 = 20 mW radiated from the antenna under ideal conditions.
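In code, that step looks like this (a minimal Python sketch of my own arithmetic; the helper `dbm_to_watts` is just a name I made up, not a library function):

```python
def dbm_to_watts(dbm):
    """Convert a power level in dBm to watts."""
    return 10 ** (dbm / 10) / 1000

tx_power_w = dbm_to_watts(10)    # 10 dBm -> 0.010 W
tx_gain_linear = 10 ** (3 / 10)  # 3 dBi -> ~1.995, which I'm rounding to 2
radiated_w = tx_power_w * tx_gain_linear
print(radiated_w)                # ~0.020 W
```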
Based on the inverse-square law, at my distance of r = 40 m, we get:
$$ P = { 0.02 \over 4 \pi r^2 } $$
$$ P = { 0.02 \over 4 \pi \times 40^2 } $$
$$ P = 9.94718394 \times 10^{-7} $$
Yipes that seems small.
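Here's the same step in Python so the arithmetic can be checked (assuming r is in metres, which is what I intended):

```python
import math

radiated_w = 0.02  # from the previous step
r_m = 40           # assumed distance in metres

# Spread the radiated power over the surface of a sphere of radius r.
power_density = radiated_w / (4 * math.pi * r_m ** 2)
print(power_density)  # ~9.947e-07
```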
This is the bit where I get stuck. Can I just multiply it by the receiving gain (3 dBi, basically 2×) and sub it into Ohm's law (with the impedance of the antenna, 50 ohms) combined with the power formula (P = IV)?
$$ P = \left({V \over R}\right) V $$
$$ 2 \times 9.94718394 \times 10^{-7} = \left({V \over 50}\right) V $$
After doubling the power (because the receiving antenna gives me 3 dBi of gain) and plugging this formula into Wolfram Alpha to solve, I get ~10 mV.
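The same final step in Python (again just checking my arithmetic, not the physics):

```python
import math

power_density = 9.94718394e-7   # result of the inverse-square step
received_w = 2 * power_density  # doubled for the 3 dBi receive gain, per my reasoning

r_load = 50  # ohms presented by the receiver circuitry
# P = V^2 / R  =>  V = sqrt(P * R)
voltage = math.sqrt(received_w * r_load)
print(voltage)  # ~0.00997 V, i.e. ~10 mV
```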
This seems low, and I'm not confident in the logic of my derivation. Is this correct?