It's not a completely trivial question. Measuring noise power is more difficult than measuring tones.
Apparent noise power is affected by:
- Detector (if any) type and its response
- Non-linearities in the system (e.g. the log amplifier in a spectrum analyser)
- Bandwidth of the filter used (noise equivalent bandwidth is not the same as -3 dB bandwidth)
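To illustrate the last point, here is a short sketch (not from the original answer) computing the noise-equivalent bandwidth of an FFT window in bins: the ratio \$N \sum w_n^2 / (\sum w_n)^2\$. For a Hann window this comes out at 1.5 bins, so a bin that is nominally "124 Hz" wide actually integrates about 186 Hz worth of noise.

```python
import math

def enbw_bins(window):
    """Noise-equivalent bandwidth of a window function, in FFT bins:
    N * sum(w^2) / (sum(w))^2."""
    n = len(window)
    return n * sum(w * w for w in window) / sum(window) ** 2

# Periodic Hann window, as typically used for FFT analysis
N = 1024
hann = [0.5 * (1 - math.cos(2 * math.pi * k / N)) for k in range(N)]

print(enbw_bins(hann))           # ~1.5 bins for Hann
print(enbw_bins([1.0] * N))      # 1.0 bin for rectangular
```

So a noise reading per Hann-windowed bin must be reduced by about \$10\log_{10}(1.5) \approx 1.76\ \text{dB}\$ before dividing by the bin spacing.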
For example, here's a diagram of one channel, and an adjacent one (dashed). The scalloping loss between channels is not necessarily equal to the adjacent-channel leakage.

So when converting between "noise seen in a certain 124 Hz channel" and "noise power density in \$\text{dBm}/\text{Hz}\$ or \$\text{V}/\sqrt{\text{Hz}}\$", you may need to correct for more than just the bandwidth. This applies whether the channel is an FFT bin or a spectrum analyser resolution bandwidth setting.
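As a rough sketch of such a conversion (my own illustration, not part of the original answer): subtract \$10\log_{10}(\text{ENBW})\$ to normalise to 1 Hz, and optionally add a detector correction. The 2.51 dB figure below is the commonly quoted under-response of a log/envelope detector to Gaussian noise; treat it as an assumption and check it against the application note for your instrument.

```python
import math

def noise_density_dbm_per_hz(p_meas_dbm, enbw_hz, log_detector_corr_db=2.51):
    """Convert noise power seen in one channel to dBm/Hz.

    p_meas_dbm        -- displayed noise power in the channel (dBm)
    enbw_hz           -- noise-equivalent bandwidth of the channel filter
                         (Hz), NOT the -3 dB bandwidth
    log_detector_corr_db -- assumed under-response of a log/envelope
                         detector to Gaussian noise (~2.51 dB);
                         use 0 for a true RMS/power detector
    """
    return p_meas_dbm + log_detector_corr_db - 10 * math.log10(enbw_hz)

def dbm_per_hz_to_v_per_rthz(psd_dbm_hz, r_ohms=50.0):
    """Convert a power density in dBm/Hz to V/sqrt(Hz) in r_ohms."""
    p_w_per_hz = 10 ** (psd_dbm_hz / 10) / 1000  # dBm -> W
    return math.sqrt(p_w_per_hz * r_ohms)

# Example: -90 dBm read in a 124 Hz Hann-windowed FFT bin (ENBW ~ 1.5 bins),
# with an RMS detector so no log correction applies
psd = noise_density_dbm_per_hz(-90.0, 1.5 * 124, log_detector_corr_db=0)
print(psd)                          # dBm/Hz
print(dbm_per_hz_to_v_per_rthz(psd))  # V/sqrt(Hz) in 50 ohms
```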
If you're using a spectrum analyser, it will have a mode called "marker noise" which compensates as well as possible for these effects, to give you the effective noise power per Hz in a signal-free channel. If you're doing it yourself, I recommend reading Agilent application note 1303, *Spectrum Analyzer Measurements and Noise: Measuring Noise and Noise-like Digital Communications Signals with a Spectrum Analyzer*.
This application note has some great background on noise measurement in different bandwidths. It's aimed at spectrum analysers, but many of the principles apply to other measurement systems.
It can be found all over the web, though not on the official site as far as I can see. Here is one copy.