I have what is probably an easy question, but I feel I might be overlooking something.
The question I was given was:
A 15 V power supply is used to power a 6 V, 500 mA device. The device draws 500 mA at the rated voltage, but will operate properly with ±20% voltage variation. Design the appropriate voltage divider. Note that the load current requirement is variable and not always 500 mA (edited again for clarity on what I was told about the question).
(Schematic: my proposed voltage divider, created in CircuitLab, with R1 and R2 and the load \$R_L\$ across R2.)
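For reference, here is how I read the spec numerically (just a quick Python restatement of the given numbers, nothing designed yet):

```python
# Restating the given spec only; no design values in here.
V_RATED = 6.0                # rated device voltage (V)
I_RATED = 0.500              # rated device current (A)
TOL = 0.20                   # allowed +/-20% voltage variation

v_min = V_RATED * (1 - TOL)  # 4.8 V lower limit
v_max = V_RATED * (1 + TOL)  # 7.2 V upper limit
r_load_rated = V_RATED / I_RATED  # 12 ohm equivalent load at the rated point

print(f"Acceptable device voltage: {v_min:.1f} V to {v_max:.1f} V")
print(f"Equivalent load resistance at 500 mA: {r_load_rated:.0f} ohm")
```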
My question is: What do I take into consideration when designing the circuit?
Do I design the circuit so that the "Ohm's Law \$V_t/R_t = I\$ current" is at least 500 mA, while also making sure, by choosing the right resistor values, that the device receives 4.8 V to 7.2 V?
Or can I design the circuit taking just the voltage into consideration (ignoring the current, apart from making sure the overall current in the circuit is greater than 500 mA)? By that I mean: do I assume that at 500 mA in the circuit there will be a certain voltage drop, determined by the resistor values?
If that is the case, do I ignore the "Ohm's Law \$V_t/R_t = I\$ current"?
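To make that concern concrete, here is a small sketch of the simplest possible case: a single series resistor and no R2, sized only for the 500 mA point. The 18 Ω value is my own calculation, \$(15\,V - 6\,V)/0.5\,A\$, not something given in the problem, and 100 mA is the lighter-load extreme I mention below:

```python
# Hypothetical single series resistor sized only for the 500 mA case:
# R_series = (15 V - 6 V) / 0.5 A = 18 ohm (my own guess, not from the problem).
V_IN = 15.0
R_SERIES = 18.0

for i_load in (0.100, 0.500):            # load-current extremes (A)
    v_device = V_IN - i_load * R_SERIES  # voltage left across the device
    print(f"I_load = {i_load*1000:.0f} mA -> device sees {v_device:.1f} V")
# 500 mA gives 6.0 V, but 100 mA gives 13.2 V, far outside 4.8 V to 7.2 V,
# which is why I doubt a fixed voltage-drop assumption is enough here.
```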
As a follow-up question: do I have to use the "unloaded voltage divider equation", i.e. \$V_{out} = V_{in} \left(\frac{R2}{R1+R2}\right)\$, or can I just work with the voltage drops at 100 mA and 500 mA, making sure that the voltage drop across \$R2 / R_L\$ stays between 4.8 V and 7.2 V?
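To show what I mean by the two approaches, here is a sketch comparing the unloaded divider voltage with the loaded output, using the Thevenin equivalent of R1/R2 and an arbitrary trial pair R1 = R2 = 10 Ω that I picked purely for illustration (not a proposed answer):

```python
# Loaded-divider check using the Thevenin equivalent of the R1/R2 divider.
# R1 = R2 = 10 ohm are arbitrary trial values, not the intended answer.
V_IN = 15.0
R1, R2 = 10.0, 10.0

v_unloaded = V_IN * R2 / (R1 + R2)  # unloaded divider equation: 7.5 V here
r_th = R1 * R2 / (R1 + R2)          # Thevenin resistance seen by the load: 5 ohm

print(f"Unloaded V_out = {v_unloaded:.2f} V")
for i_load in (0.100, 0.500):              # load-current extremes (A)
    v_loaded = v_unloaded - i_load * r_th  # output once the load draws current
    ok = 4.8 <= v_loaded <= 7.2
    print(f"I_load = {i_load*1000:.0f} mA -> V_out = {v_loaded:.2f} V "
          f"({'inside' if ok else 'outside'} the 4.8-7.2 V window)")
```

If this is the right way to look at it, the unloaded equation only gives the open-circuit value, and the load current then pulls the output down across \$R1 \parallel R2\$; that interpretation is what I would like confirmed.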