I bought a new power supply. It is possible to set both current and voltage on the unit. I have previously been using a transformer with fixed voltage modes at 3, 4.5, 6, 7.5, 9, and 12 volts, rated at 2000 mA, 24 VA (max). I have not cared much about the current before, but now that I can set the current, I get a bit confused about the proper settings.
How should I think about how many amperes I should set the supply to feed?
I have been thinking that the current is regulated by the load, i.e., that if I supply 12 V to something, the current adjusts itself based on the load, and I do not have to think about the current beyond not exceeding what the supply is capable of delivering. For example, with phone chargers, a 2 A charger charges your phone faster than a 0.5 A one, and it is the phone that determines how much power it draws, right?
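Here is a small sketch of how I currently picture it, just Ohm's law with made-up resistance values (not the specs of any real phone or charger):

```python
# Sketch: the load, not the supply, determines the current drawn (I = V / R).
# The resistance values below are made-up illustrations, not real device specs.

def current_drawn(supply_voltage, load_resistance):
    """Current a resistive load tries to draw at a given voltage."""
    return supply_voltage / load_resistance

# A "heavy" load (low resistance) draws more current than a "light" one.
print(current_drawn(5.0, 10.0))   # 0.5 A -> a 0.5 A charger would suffice
print(current_drawn(5.0, 2.5))    # 2.0 A -> needs a 2 A-capable charger
```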
So how carefully do I need to set the current? Can I think of the current setting as the maximum number of amperes the supply is allowed to deliver, or do I need to be more accurate than that? (I assume I need to be more accurate, since otherwise there would be no need for a current setting at all beyond the supply's maximum value as a reference point, in my case around 18 amps.)
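To make the question concrete, this is how I imagine the current knob behaves if it really is just a limit (this is my assumption about constant-voltage/constant-current behaviour, not something I have verified on my unit):

```python
# Sketch, assuming the current knob is a limit on a CV/CC bench supply:
# with a resistive load, the supply holds the set voltage unless the load would
# draw more than the current limit, in which case the voltage sags so the
# current stays at the limit (constant-current mode).

def output(v_set, i_limit, r_load):
    i_wanted = v_set / r_load           # current the load would draw at full voltage
    if i_wanted <= i_limit:
        return v_set, i_wanted          # CV mode: voltage held, current set by load
    return i_limit * r_load, i_limit    # CC mode: current held, voltage drops

print(output(12.0, 2.0, 10.0))  # (12.0, 1.2) -> well below the limit
print(output(12.0, 2.0, 3.0))   # (6.0, 2.0)  -> limit reached, voltage sags
```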
Another example: if I feed an A4988 stepper motor driver from the unit, there is a current limiter in the driver itself, so I assume it is reasonable to set that limit manually as in this description:
https://ardufocus.com/howto/a4988-motor-current-tuning/
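If I understand tuning guides like that one correctly, the driver's own limit comes from a relation roughly like the one below; the sense-resistor value is an assumption on my part, since it varies between board revisions (e.g. 0.05, 0.068, or 0.1 ohm):

```python
# Sketch of the A4988 current-limit relation used in tuning guides:
# I_limit = VREF / (8 * R_sense).
# R_sense = 0.068 ohm is an assumption; check the resistors on your own board.

def vref_for_current(i_limit_amps, r_sense_ohms=0.068):
    """VREF (volts) to measure at the trim pot for a desired motor current limit."""
    return 8 * r_sense_ohms * i_limit_amps

print(vref_for_current(1.0))   # ~0.54 V for a 1 A limit with 0.068 ohm sense resistors
```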
Another example: if I supply a Raspberry Pi with 5 V and 3 A (same as the official adapter), is 3 A then the maximum the supply will deliver? And if I, based on Ohm's law, connect a load of less than 1.66 ohms (5 V / 3 A), which would try to draw more than 3 A, can something break? Or will the supply still not deliver more than 3 A?
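This is the arithmetic I am doing for that case, again assuming the 3 A setting acts as a limit and the supply goes into constant-current mode rather than breaking (the 1 ohm load is a hypothetical example, not the Pi itself):

```python
# Sketch: with 5 V out and a 3 A limit, the smallest resistive load that can be
# driven at full voltage is R = V / I = 5 / 3 ~= 1.67 ohm. Anything lower would
# demand more than 3 A; my assumption is that a current-limited supply then
# drops its voltage instead of delivering more.

v_out = 5.0
i_limit = 3.0
r_min = v_out / i_limit
print(r_min)               # ~1.67 ohm

r_load = 1.0               # hypothetical load below that minimum
i_wanted = v_out / r_load  # 5 A -- more than the limit
v_actual = min(v_out, i_limit * r_load)
print(i_wanted, v_actual)  # 5.0, 3.0 -> supply would sag to ~3 V at 3 A
```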