If you charge the capacitor to some level and then connect it in parallel with the resistor, a current will begin to flow.
In reality this current becomes smaller as the capacitor discharges (and the voltage across it therefore drops). But suppose we somehow forced the current to stay at its initial magnitude through the resistor until the capacitor was fully discharged; it would then take a certain amount of time for the capacitor to reach 0 V.
It turns out that this "certain amount of time" is the same no matter how much or how little you charged the capacitor originally. (If you charge it more, there is more charge to discharge, but the current is proportionally higher, because a higher charge produces a higher voltage.) This time is the product of the capacitance and the resistance -- in other words, your time constant.
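Written out, with \$V_0\$ standing for whatever initial voltage you charged the capacitor to: the stored charge is \$Q = C V_0\$ and the (held-constant) initial current through the resistor is \$I = V_0 / R\$, so the discharge time would be

\$\$ t = \frac{Q}{I} = \frac{C V_0}{V_0 / R} = RC. \$\$

The \$V_0\$ cancels, which is exactly why the result does not depend on how much you charged it.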
And that is, intuitively, why the time constant has units of time.
(Alternatively, the time constant is how long it will take for the voltage to drop to \$\frac1e\$ of its original value, in the more realistic situation where we leave the system alone and let the current drop with the voltage as per Ohm's Law).
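To see where that \$\frac1e\$ comes from, again writing \$V_0\$ for the initial voltage: in the realistic case the voltage decays exponentially,

\$\$ v(t) = V_0\, e^{-t/RC}, \$\$

so at \$t = RC\$ the voltage has fallen to \$V_0 e^{-1} \approx 0.37\, V_0\$.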