I have zero experience with antennas and little experience with electronics. I need to detect weak radio signals in the range of roughly 100 MHz–1.3 GHz and analyze them with an oscilloscope.
I bought a large, broadband log-periodic dipole antenna. I also have an excellent oscilloscope with more than sufficient bandwidth and sampling rate for such signals. The antenna is intended to be mounted on a roof and does not include a mast or any way to mount it (I guess the manufacturer assumes you already have these things). I only want to use it indoors and would prefer to be able to move it out of the way often, since it's quite bulky. Do I need amplification or additional grounding?
What I did so far:
The antenna has an N-type connector output. I used a BNC adapter and connected the antenna (which was just lying on a table) directly to the oscilloscope. This seemed to work: I saw a signal on the order of microvolts and could also identify some data packets from surrounding communication devices, indicating that the antenna is working. The low signal amplitude is not a huge problem for me, although I'm still thinking about amplifying it somehow.
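For a sense of scale, here is how I estimate the received power from the voltage I see on the scope, assuming the input is terminated in 50 Ω (the 10 µV figure below is just an illustrative value, not an actual measurement):

```python
import math

def dbm_from_vrms(v_rms, r_ohms=50.0):
    """Convert an RMS voltage across a resistive load to power in dBm.

    P = V^2 / R, then referenced to 1 mW: dBm = 10*log10(P / 1 mW).
    """
    p_watts = v_rms ** 2 / r_ohms
    return 10 * math.log10(p_watts / 1e-3)

# Example: 10 µV RMS into a 50 Ω scope input
print(round(dbm_from_vrms(10e-6)))  # → -87
```

So microvolt-level amplitudes are somewhere around −90 dBm, which is why I keep wondering whether a low-noise amplifier would be needed for anything useful.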
However, after some time I felt static electricity shocks when touching the BNC connectors, which made me worry about damaging the oscilloscope.
Why does the antenna charge up?
The radio signals themselves should not contribute to this, right? Do I need to ground the antenna somehow? Tutorials on installing TV antennas require separately grounding the mast to which the antenna is fixed. However, that seems to be a precaution against lightning, which is not relevant in my case.
Is it OK to just connect the signal directly to my oscilloscope?