I'm designing a switched-mode power supply. I need to give my users the ability to set a voltage setpoint over a range of approximately 200-800V, with at least 4V of resolution. This needs to be a hardware solution that feeds into my microcontroller. Cheap, simple, and hard for the user to screw up are the goals. My predecessors have used potentiometers, which make it impossible to know what you've actually set without running the system. I don't have a display to work with, just a couple of blinky lights, which aren't an effective way to communicate a voltage like this.
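For what it's worth, the resolution requirement works out to about 150 distinct settings, so the input scheme has to carry at least 8 bits of information. A quick sanity check, using nothing beyond the numbers above:

```c
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double v_min = 200.0, v_max = 800.0, step = 4.0;
    // Number of distinct setpoints the input scheme must encode.
    int settings = (int)((v_max - v_min) / step) + 1;  // 151
    // Minimum bits needed to distinguish them all.
    int bits = (int)ceil(log2((double)settings));      // 8
    printf("%d settings -> at least %d bits\n", settings, bits);
    return 0;
}
```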
I'm thinking of using rotary switches. Three rotary switches, combined with an appropriate resistor network, should let me translate the switch settings into an analog voltage. In my ideal world, 000 translates to 0V, 999 translates to 3.3V, and every setting in between scales linearly. Alternatively, I could read the same information as twelve digital inputs (three BCD-encoded switches at four bits each), but I'd need an I/O expander for that.
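On the firmware side, decoding the analog value back into a setting seems straightforward. Here's a rough sketch of what I have in mind, assuming a 12-bit ADC referenced to the same 3.3V rail (that's an assumption about my particular MCU) and an ideal resistor network:

```c
#include <stdint.h>
#include <stdio.h>

// Map a 12-bit ADC reading (0..4095, full scale = 3.3 V) back to the
// three-digit switch setting 000..999, rounding to the nearest of the
// 1000 evenly spaced levels. Assumes an ideal, perfectly linear network.
static uint16_t adc_to_setting(uint16_t adc_raw)
{
    return (uint16_t)(((uint32_t)adc_raw * 999u + 2047u) / 4095u);
}

int main(void)
{
    // e.g. a mid-scale reading of 2048 should decode to setting 500.
    printf("ADC 2048 -> setting %u\n", (unsigned)adc_to_setting(2048));
    return 0;
}
```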
I'm sure I'm not the first person to consider something like this. Is there a canonical way of doing it? If I go the analog route, how many resistors am I going to need, and how many different values? Or is there an obviously better way to address this problem?
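In case it helps to see the mapping concretely, here's the idealized model I've been working from: each digit contributes with a 100:10:1 weighting, and the sum is scaled to 3.3V full scale. It ignores resistor tolerance and switch contact resistance entirely, which may be exactly where this scheme falls apart:

```c
#include <stdio.h>

// Idealized model of the analog encoding: three decade digits summed with
// weights 100:10:1 and scaled so 000 -> 0 V and 999 -> 3.3 V. Real resistor
// tolerances and switch contact resistance would perturb this.
static double encoded_volts(int hundreds, int tens, int ones)
{
    int setting = 100 * hundreds + 10 * tens + ones;
    return 3.3 * setting / 999.0;
}

int main(void)
{
    // Adjacent codes sit only ~3.3 mV apart, so the resistor network and
    // ADC together have to hold better than that across all 1000 codes.
    printf("500 -> %.4f V\n", encoded_volts(5, 0, 0));
    printf("501 -> %.4f V\n", encoded_volts(5, 0, 1));
    return 0;
}
```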