I reconstructed a signal (from simulations) in the time domain from t ≈ 0.01 s to 2000 s. I do this by calculating "modes" (constants of the system I'm studying) that should be exact, and then constructing the time-domain signal from a formula. I chose the sample rate of 10 kHz somewhat arbitrarily, since the corresponding sampling interval is far shorter than the period of the fastest mode. I then wish to take the FFT.
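To make the setup concrete, here is a minimal sketch of the kind of thing I'm doing. The mode values are made up for illustration (my actual modes come from the simulation), and I've shortened the duration from 2000 s to keep the sketch light:

```python
import numpy as np

# Hypothetical stand-in for my mode sum: damped cosines with made-up
# amplitudes a, frequencies f (Hz), and decay rates g (1/s).
fs = 10_000.0                       # sample rate, Hz
t = np.arange(0.01, 20.0, 1 / fs)   # shortened from 2000 s for this sketch

modes = [(1.0, 0.5, 1e-3),          # (a, f, g) -- illustrative values only
         (0.3, 12.0, 1e-2)]
x = sum(a * np.exp(-g * t) * np.cos(2 * np.pi * f * t) for a, f, g in modes)

# One-sided FFT; the frequency axis runs from 0 up to Nyquist (fs/2 = 5 kHz).
X = np.fft.rfft(x)
freqs = np.fft.rfftfreq(len(t), d=1 / fs)
```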
The signal is not bandlimited, but I don't think the time-domain signal itself should suffer from aliasing. When I take the FFT, I look at frequencies up to the Nyquist frequency of 5 kHz, but I understand that since the signal isn't bandlimited, I probably can't trust that whole range. Also, there are features at higher frequencies (above roughly half the Nyquist frequency) that don't make sense physically, but I have no clear a priori knowledge of where the cutoff frequency / passband should be.
My understanding is that I can / should apply an anti-aliasing filter when reconstructing the original signal in the time domain, and then take the FFT. Can somebody please clarify my thinking on this? My background is in physics, not signal processing, and I feel I'm muddling a lot of these concepts.