One view of an ideal current source is that it "enforces" a current: there is no limit to the voltage the source will apply to make that current flow. There is actually a device that approximates this behavior very well, if only for very short times: the humble inductor.
As a reminder, the voltage and current of an inductor are linked by V = L * dI/dt, or in words: the change in inductor current is proportional to the voltage across the inductor and to the elapsed time, and inversely proportional to the inductance. If our inductor is "very big" and the timespan we look at is "very small", then the current is practically constant, even for large voltages.
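To get a feel for the numbers, here is a one-line sanity check of dI = V * dt / L; the component values (a 1 H inductor with 5 kV across it for 1 µs) are assumed purely for illustration:

```python
# Numeric check of dI = V * dt / L: even at a large voltage,
# a big inductor barely changes its current over a short timespan.
# All values below are illustrative assumptions, not from the text.

L = 1.0      # inductance in henries ("very big" for electronics)
V = 5000.0   # voltage across the inductor in volts
dt = 1e-6    # timespan in seconds ("very small")

dI = V * dt / L
print(f"Current change over 1 us at {V:.0f} V: {dI*1e3:.1f} mA")
# -> Current change over 1 us at 5000 V: 5.0 mA
```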
Unlike other current sources, the magnetic field inside an inductor ensures that the current cannot change abruptly, and the inductor will generate extreme voltages to keep that current flowing. For example, if you "charge" an inductor from a low voltage (e.g. 12 V) and suddenly break the circuit, the voltage across the inductor can easily rise by several orders of magnitude (e.g. to 5 kV), just to keep that current flowing. In reality, there is always some parasitic capacitance that absorbs this current; otherwise you get a spark, which is what happens when the inductor is so adamant about keeping its current flowing that it forces the air to become a conductor.
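A quick energy-balance sketch shows where such spike figures come from: the magnetic energy (1/2) * L * I^2 ends up in the parasitic capacitance as (1/2) * C * V^2, so the peak voltage is V_peak = I * sqrt(L/C). The values below (10 mH, 1 A, 400 pF) are assumed for illustration and happen to land right at 5 kV:

```python
# Energy-balance sketch of the switch-off spike: when the circuit breaks,
# the inductor current has nowhere to go except the parasitic capacitance,
# and the peak voltage follows from (1/2)*L*I^2 = (1/2)*C*V^2.
# L, I and C below are assumed example values, not from the text.
import math

L = 10e-3    # inductance: 10 mH
I = 1.0      # current flowing when the switch opens: 1 A
C = 400e-12  # parasitic capacitance across the break: 400 pF

V_peak = I * math.sqrt(L / C)
print(f"Peak voltage across the opening switch: {V_peak/1e3:.1f} kV")
# -> Peak voltage across the opening switch: 5.0 kV
```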
This principle is used in every step-up converter: the higher output voltage is generated by an inductor whose current path to ground is periodically interrupted; the inductor then forces its current into the output capacitor, even though that capacitor sits at a higher voltage than the input.
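To make that concrete, here is a minimal time-stepped sketch of the two phases of a boost converter. All component values (12 V input, 100 µH, 10 µF, an output capacitor pre-charged to 24 V, 1 µs steps) are assumed for illustration:

```python
# Minimal forward-Euler sketch of a boost converter's two phases.
# Phase 1: the switch to ground is closed and the inductor charges up.
# Phase 2: the switch opens; the inductor keeps its current flowing and
# pumps it into an output capacitor that is already ABOVE the input voltage.
# All values are assumed example values, not from the text.

V_in = 12.0   # input voltage in volts
L = 100e-6    # inductance: 100 uH
C = 10e-6     # output capacitance: 10 uF
V_out = 24.0  # output capacitor already charged above V_in
I = 0.0       # inductor current in amperes
dt = 1e-6     # simulation timestep: 1 us

# Phase 1: switch closed, full input voltage across the inductor.
for _ in range(50):
    I += V_in / L * dt
print(f"End of charge phase: I = {I:.2f} A")   # -> 6.00 A

# Phase 2: switch open, inductor current forced into the output cap.
while I > 0:
    I += (V_in - V_out) / L * dt  # negative slope, since V_out > V_in
    V_out += I / C * dt           # yet the current still charges the cap
print(f"After discharge: V_out = {V_out:.2f} V")
# -> V_out rises to roughly 34-35 V, despite exceeding V_in the whole time
```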
To summarize: for very short periods of time, an inductor behaves very much like an ideal current source.