1)
You might be thinking about amperage the wrong way. Current (amp) limiting techniques exist if you need them, but for the most part power supplies are designed as either constant voltage or constant current. Constant-current supplies are the less common type, and they do what the name says: they hold the output current steady by varying the voltage. Constant-voltage supplies (far more common, and what your wall outlet behaves like) do the opposite: they hold the voltage steady and let the current vary. So if a 120 volt device wants to draw a varying amount of power (watts), that shows up as its current draw increasing or decreasing, since power = voltage x current. The 20 amp rating on a voltage supply (or a circuit) is the maximum current draw permissible before damage or fire. If you try to draw more than that, you will usually hit some form of overcurrent protection (a breaker or a fuse). If there is no overcurrent protection, a fire may result.
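If it helps to see the arithmetic, here is a rough Python sketch of that idea (the wattages and the 120 V / 20 A figures are just example values, not anything from your setup): the supply holds the voltage fixed, the load's power demand sets the current, and the breaker rating is simply a ceiling on that current.

    # Constant-voltage supply arithmetic: the load's power demand sets the current.
    # Example values only.

    def current_draw(power_watts: float, voltage_volts: float) -> float:
        """Current the load pulls from a fixed-voltage source: I = P / V."""
        return power_watts / voltage_volts

    SUPPLY_VOLTAGE = 120.0   # volts, held (roughly) constant by the supply
    BREAKER_RATING = 20.0    # amps, the overcurrent protection threshold

    for load_watts in (100.0, 1500.0, 3000.0):
        amps = current_draw(load_watts, SUPPLY_VOLTAGE)
        status = "OK" if amps <= BREAKER_RATING else "exceeds rating (breaker trips, or fire without protection)"
        print(f"{load_watts:6.0f} W load -> {amps:5.2f} A  {status}")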
2)
This is complex and touches a lot of topics. Since your house supplies AC (alternating current) at mains voltage, a device usually has some stage that steps the voltage down to something it can handle. That is typically done with a transformer, which works by electromagnetic induction. A transformer can step voltage up or down; the current scales the opposite way, so the power in is roughly the power out (minus losses, which vary with the design but are usually acceptable). After the step-down stage there is normally a rectification stage, where the AC is converted into DC (direct current), which is what most devices use internally.
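As a quick illustration of the voltage/current trade-off, here is a sketch of an ideal transformer (real ones lose a few percent to heat and core losses; the 120 V to 12 V numbers are made up for the example):

    # Ideal transformer: turns ratio sets the voltage ratio, current scales the
    # other way, so power in ~= power out. Example numbers only.

    def ideal_transformer(v_primary: float, i_primary: float, turns_ratio: float):
        """turns_ratio = N_secondary / N_primary. Returns (V_secondary, I_secondary)."""
        v_secondary = v_primary * turns_ratio
        i_secondary = i_primary / turns_ratio
        return v_secondary, i_secondary

    # Step 120 V AC down to 12 V AC (10:1 turns ratio) with 0.5 A on the primary.
    v_out, i_out = ideal_transformer(120.0, 0.5, turns_ratio=0.1)
    print(f"Secondary: {v_out:.1f} V at {i_out:.1f} A "
          f"(power in {120.0 * 0.5:.0f} W ~= power out {v_out * i_out:.0f} W)")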
3)
As explained in #1, the amperage rating of a voltage supply is the maximum current that can be drawn from it, so you would be fine as long as the voltage output is the same and the new supply's current rating meets or exceeds the old one's. That said, it is much better to have at least a rough idea of what the device actually requires. Sometimes under-voltage is just as bad as over-voltage.
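For what it's worth, that rule of thumb boils down to something like this (the numbers and the 5% voltage tolerance are assumptions for the example; check the device's actual spec):

    # Replacement-supply rule of thumb: voltage must match, current rating must
    # meet or exceed the device's maximum draw. Hypothetical values.

    def supply_is_suitable(device_volts: float, device_max_amps: float,
                           supply_volts: float, supply_rated_amps: float,
                           volt_tolerance: float = 0.05) -> bool:
        """True if the supply voltage matches (within tolerance) and its
        current rating covers the device's maximum draw."""
        voltage_ok = abs(supply_volts - device_volts) <= device_volts * volt_tolerance
        current_ok = supply_rated_amps >= device_max_amps
        return voltage_ok and current_ok

    print(supply_is_suitable(12.0, 2.0, 12.0, 5.0))  # True: same voltage, extra current headroom
    print(supply_is_suitable(12.0, 2.0, 9.0, 5.0))   # False: under-voltage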