I am trying to fully understand the problem with using an electrically short antenna to transmit RF signals (i.e., an antenna whose length is significantly shorter than the intended wavelength).
Currently my understanding is that such an antenna does not radiate: since the length of the antenna is much shorter than the wavelength, the voltage distribution along the antenna at any given moment is roughly uniform. In particular, there is essentially no voltage variation along the antenna, hence almost no current flow, hence no magnetic field, and hence no radiation.
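(The textbook quantity I am trying to connect this picture to is, as far as I understand, the radiation resistance of a short dipole, which falls off quadratically with electrical length:

$$ R_\text{rad} \approx 20\pi^2 \left(\frac{l}{\lambda}\right)^2, \qquad P_\text{rad} = \tfrac{1}{2} I_0^2 R_\text{rad}, $$

where $l$ is the dipole length and $I_0$ the feedpoint current.)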
I have been trying to quantify this explanation. The simplest quantification is that the impedance of an electrically short antenna is large, so the power transferred to the antenna is small. But suppose I match the source impedance to the antenna impedance and crank up the voltage to the point where the power delivered to the antenna is large. Now even the small variations in the roughly uniform voltage distribution are magnified, and a magnetic field appears.
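Here is a rough sketch of the kind of calculation I have in mind (Python; the 1 m dipole length, the 2 Ω loss resistance, and the 100 W input power are arbitrary illustrative choices, and the radiation resistance formula is the standard short-dipole approximation):

```python
import math

f = 10e6                 # operating frequency, Hz
c = 3e8                  # speed of light, m/s
wavelength = c / f       # ~30 m at 10 MHz
length = 1.0             # dipole length, m -> l/lambda ~ 0.033 (electrically short)

# Standard short-dipole radiation resistance: R_rad = 20 * pi^2 * (l/lambda)^2
r_rad = 20 * math.pi**2 * (length / wavelength) ** 2

# Assumed ohmic loss resistance of the conductor and matching coil (illustrative)
r_loss = 2.0

# With a lossless conjugate match, the reactance is tuned out and the source
# sees r_rad + r_loss in series; the radiated fraction is the efficiency.
efficiency = r_rad / (r_rad + r_loss)

p_in = 100.0                         # watts delivered after matching
p_radiated = efficiency * p_in

print(f"l/lambda   = {length / wavelength:.3f}")
print(f"R_rad      = {r_rad:.3f} ohm")
print(f"efficiency = {efficiency:.1%}")
print(f"P_radiated = {p_radiated:.1f} W of {p_in:.0f} W")
```

With these numbers the radiation resistance comes out around 0.2 Ω, so even a modest loss resistance swamps it, which is what makes me suspect the impedance story is not the whole picture.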
I am wondering: is the high impedance really the only issue that prevents an electrically short antenna from radiating? (Indeed, the antenna now behaves like an RC circuit, but I don't see why this is a problem if, for example, R = 50 Ω, the capacitive reactance is 10 kΩ, the frequency is in the 10 MHz range, and the voltage is very large.)
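To make this concrete, here is a quick back-of-the-envelope calculation with those example numbers (Python; the 100 W power level is just an arbitrary illustrative choice):

```python
import math

f = 10e6            # Hz
r = 50.0            # series resistance, ohm
x_c = 10e3          # capacitive reactance, ohm

# Equivalent series capacitance: X_C = 1 / (2*pi*f*C)
c_eq = 1.0 / (2 * math.pi * f * x_c)   # ~1.6 pF

# Loaded Q of the tuned-out circuit and the resulting matched bandwidth
q = x_c / r                             # ~200
bandwidth = f / q                       # ~50 kHz

# Current needed to push 100 W into the 50-ohm resistance, and the
# voltage that current develops across the 10-kohm reactance
p = 100.0
i_rms = math.sqrt(p / r)                # ~1.4 A
v_cap = i_rms * x_c                     # ~14 kV across the antenna reactance

print(f"C ~ {c_eq*1e12:.2f} pF, Q ~ {q:.0f}, bandwidth ~ {bandwidth/1e3:.0f} kHz")
print(f"I ~ {i_rms:.2f} A rms, voltage across reactance ~ {v_cap/1e3:.1f} kV")
```

So pushing 100 W through this circuit already implies roughly 14 kV across the antenna reactance and a very narrow matched bandwidth, but I still don't see a fundamental obstruction to radiating if I accept those drawbacks.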