I'm working with an audio ADC that supports a maximum audio input of 500 mVRMS before it starts clipping. That's sufficient for equipment that outputs standard line level (1 VPP / 0.35 VRMS), but I'm finding that a lot of equipment, such as computers and MP3 players, outputs up to 3.5 VPP / 1.25 VRMS. I need the input to work with both the high- and low-voltage signals.

I'm considering simply attenuating the input to roughly 1/3 so that any of these signals falls under the 500 mVRMS limit, but then my SNR will be worse on signals actually coming in at 1 VPP / 0.35 VRMS, because I'll have to amplify them much more afterward to get a good volume on the output. I've tried it and it works, but it may not be ideal.
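For reference, here's the quick arithmetic I used to convince myself the 1/3 cut keeps both source types under the limit. The divider values are just an example I picked for illustration, assuming a plain resistive divider, not my actual circuit:

```python
# Sanity check on a ~1/3 resistive divider (example values, not my real parts):
# R1 in series with the input, R2 from the ADC input to ground.
R1, R2 = 20e3, 10e3
atten = R2 / (R1 + R2)                 # = 1/3

adc_max_vrms = 0.5                     # ADC clips above 500 mVRMS
for name, vpp in [("line level", 1.0), ("consumer output", 3.5)]:
    vrms = vpp / (2 * 2 ** 0.5)        # VPP -> VRMS for a sine wave
    out = vrms * atten
    status = "OK" if out <= adc_max_vrms else "CLIPS"
    print(f"{name}: {vrms:.3f} VRMS in -> {out:.3f} VRMS at ADC ({status})")
```

This gives about 0.12 VRMS at the ADC for the 1 VPP source and about 0.41 VRMS for the 3.5 VPP source, so both fit under 500 mVRMS, but the quieter source ends up well below the ADC's full range, which is exactly the SNR penalty I'm worried about.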
Is this my best option, or is there a more effective way of reducing the higher-voltage inputs to prevent clipping before they reach the ADC?
Note: Changing the ADC is not really an option, so solutions that keep the existing ADC would be particularly helpful.