Why does a signal that is band-limited in frequency have an infinite extent in the time domain?
The assumption here is that if the signal were time-limited, it would have a discontinuity at its boundary, meaning an infinitely fast edge (zero rise time) and therefore unbounded spectral content if the samples are ideal.
But this is not the case in real band-limited systems, so the signal is assumed to be in "steady-state".
Therefore, in the normal case, when a system is analyzed with both time boundaries and bandwidth limits, we ignore any discontinuities at the ends of the "steady-state" interval.
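To see why a hard time boundary matters, here is a minimal numpy sketch (the sample rate, bandwidth, and window lengths are assumed, illustrative values): an ideally band-limited pulse is a sinc whose tails decay only as 1/t, so chopping it off in time pushes energy above the original band edge, and the harder you truncate, the more leaks out of band.

```
import numpy as np

fs = 1000.0                          # sample rate, Hz (assumed)
t  = np.arange(-20.0, 20.0, 1/fs)    # long observation record, 40 s
fc = 10.0                            # one-sided bandwidth of the ideal pulse, Hz (assumed)
x  = np.sinc(2*fc*t)                 # band-limited pulse (flat spectrum up to fc)

f = np.fft.rfftfreq(len(t), 1/fs)

def out_of_band(sig):
    # fraction of total energy that sits above the original band edge fc
    S = np.abs(np.fft.rfft(sig))**2
    return S[f > fc].sum() / S.sum()

for half_window in (10.0, 1.0, 0.1):                 # seconds kept on each side
    xt = np.where(np.abs(t) <= half_window, x, 0.0)  # time-truncated copy
    print(f"window +/-{half_window:>4} s -> out-of-band energy {out_of_band(xt):.1e}")
```

The shorter the kept window, the larger the fraction of energy that appears above fc, which is the time/bandwidth trade-off the question is about.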
This is analogous to a simple low-pass filter with a unit step applied. In theory the step can have a zero (ideal) or finite rise time, and the exponential response never actually reaches the final unit voltage, but in practical terms, with tolerances, the experiment can be stopped after ten time constants, t = 10T = 10RC.
At this point (t = 10T) the residual error is e^-10, about 45 ppm, and dV/dt has decayed by the same factor from its peak at t = 0, so the remaining spectral content of the tail is negligible; the response could therefore be captured with high accuracy using a capture bandwidth on the order of ~142x the -3 dB bandwidth.
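As a quick check on those numbers, here is a small sketch (the time-constant counts and bandwidth multiples are assumed, illustrative values): it computes the residual settling error after n time constants, and the fraction of the RC response's energy that lies above N times the -3 dB frequency, using |H(f)|^2 = 1/(1 + (f/fc)^2).

```
import numpy as np

# Residual settling error of v(t) = 1 - exp(-t/RC) after n time constants.
for n in (5, 7, 10):
    print(f"{n:>2} time constants: residual = {np.exp(-n)*1e6:8.1f} ppm")

# Fraction of the RC response's energy above N x f_3dB:
# out-of-band fraction = (2/pi)*arctan(1/N), roughly 2/(pi*N) for large N.
for N in (10, 100, 142):
    frac = (2/np.pi)*np.arctan(1/N)
    print(f"capture BW = {N:>3} x f_3dB: out-of-band energy = {frac:.1e}")
```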
So in theory, yes: you cannot have a simultaneously limited time duration and limited Fourier bandwidth, but if you allow an error tolerance, you can have both in practice.
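For the RC example this trade-off can be made explicit with a short sketch (the tolerance values are assumed examples): settling to within a tolerance eps takes t = RC*ln(1/eps), and with f_3dB = 1/(2*pi*RC) the duration-bandwidth product becomes ln(1/eps)/(2*pi), independent of RC, so a looser tolerance buys a shorter record or a narrower bandwidth.

```
import numpy as np

for eps in (1e-2, 1e-4, 45e-6):
    n_tau = np.log(1/eps)        # required duration in units of RC
    tb    = n_tau / (2*np.pi)    # duration times -3 dB bandwidth
    print(f"tolerance {eps:8.0e}: {n_tau:5.2f} time constants, T*BW = {tb:.2f}")
```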