> If they wanted the sampling frequency to be 10 Hz, why did they not just sample at 10 Hz initially?
In order to avoid aliasing, the signal has to be lowpass-filtered before sampling. No frequencies above Fs/2 should be present in the analog signal, or, realistically, they should at least be attenuated enough to be buried in the noise floor, or to a level low enough to meet your specifications.
If you sample at Fs = 10 Hz and want to acquire, say, 4 Hz signals, your filter must let those through yet provide strong attenuation above 5 Hz, so it needs a flat response in the passband and a steep roll-off after the cutoff frequency.
Such high-order filters are difficult and expensive to implement in the analog domain, but very simple in the digital domain. Digital filters are also very accurate: the cutoff frequency does not depend on component tolerances, capacitor values for example.
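To make that concrete, here is a sketch of such a digital filter in Python with SciPy (the 100 Hz rate, tap count and 4.5 Hz cutoff are illustrative numbers I picked, not anything from the question): the cutoff comes straight from the `firwin` arguments, with no dependence on component values.

```python
import numpy as np
from scipy import signal

fs = 100.0  # hypothetical oversampled rate, 10x the 10 Hz target

# 601-tap FIR lowpass: flat up to ~4 Hz, strong attenuation by 5 Hz.
# The cutoff is set exactly by these numbers -- no capacitor tolerances.
taps = signal.firwin(601, cutoff=4.5, fs=fs)

# Inspect the frequency response to check passband and stopband.
freqs, response = signal.freqz(taps, worN=8000, fs=fs)
gain = np.abs(response)

print(gain[freqs <= 4.0].min())  # passband gain: very close to 1
print(gain[freqs >= 5.0].max())  # stopband gain: tiny (around -53 dB)
```

A comparable analog response (flat to 4 Hz, ~50 dB down at 5 Hz) would take a very high-order active filter with precision components; here it is three lines.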
Thus, it is much cheaper to use a low-order analog lowpass, oversample by a large factor, then apply a sharp digital filter and downsample to the final sample rate you actually want.
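The whole idea can be sketched as follows (the 43 Hz interferer and the 10x factor are made-up illustration values): naive downsampling lets an out-of-band tone alias into the band of interest, while `scipy.signal.decimate` filters before discarding samples.

```python
import numpy as np
from scipy import signal

fs_over, q = 100.0, 10           # oversampled rate and decimation factor
t = np.arange(0, 10, 1 / fs_over)

# 4 Hz signal of interest plus a 43 Hz interferer; at a 10 Hz rate,
# 43 Hz aliases down to 3 Hz, right into the band of interest.
x = np.sin(2 * np.pi * 4 * t) + 0.5 * np.sin(2 * np.pi * 43 * t)

# Naive downsampling: keep every 10th sample, no filtering.
naive = x[::q]

# Proper decimation: sharp digital anti-alias filter, then downsample.
clean = signal.decimate(x, q, ftype="fir")

# Single-bin amplitude at frequency f (both outputs are 10 s at 10 Hz,
# so the bin index for f is f / 0.1).
def amp(sig, f):
    return 2 * np.abs(np.fft.rfft(sig)[int(f * 10)]) / len(sig)

print(amp(naive, 3))  # aliased interferer shows up near 0.5
print(amp(clean, 3))  # nearly gone after filtering
print(amp(clean, 4))  # wanted 4 Hz signal survives
```

In the naive output the 43 Hz tone is indistinguishable from a real 3 Hz signal; after proper decimation it is attenuated into the noise, which is exactly the job the steep filter was doing.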
The same digital hardware can also be shared between several channels. At such a low sampling frequency the computing power requirements are very low, so a modern microcontroller will easily implement many channels of digital filtering very cheaply.