In a 12V DC system, given a particular maximum current and length of wire, what is the best method to determine the minimum size (gauge) of that wire? In particular, what method is best for high currents over short lengths of wire?
The typical way to do this seems to be to calculate the gauge needed to keep the voltage drop below some threshold, e.g. 2%, perhaps using an online calculator. However, if I understand correctly, the % voltage drop is not actually what matters for fire safety; what matters is the heat dissipated per unit length of wire. Running 200A over 1 foot of 10 AWG wire will produce roughly the same % voltage drop as running 20A over 10 feet of the same wire (10x the current through 1/10 the resistance, so the same I×R), but since heat per foot scales with the square of the current, the first case dissipates 100x more heat per foot, so presumably there will be a higher risk of melted insulation and fire.
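To make the comparison concrete, here is a quick Python sketch of the arithmetic. The ~1 mΩ/ft resistance for 10 AWG copper is an assumed round-number approximation at room temperature:

```python
SYSTEM_VOLTAGE = 12.0   # volts
R_PER_FOOT = 0.001      # ohms per foot, ~10 AWG copper (assumed approximation)

def wire_stats(current_a, length_ft, r_per_ft=R_PER_FOOT):
    """Return (% voltage drop, watts dissipated per foot) for a wire run."""
    v_drop = current_a * r_per_ft * length_ft   # V = I * R
    drop_pct = 100.0 * v_drop / SYSTEM_VOLTAGE
    watts_per_ft = current_a ** 2 * r_per_ft    # P per foot = I^2 * (R per foot)
    return drop_pct, watts_per_ft

print(wire_stats(200, 1))    # ~ (1.67, 40.0) -> same % drop,
print(wire_stats(20, 10))    # ~ (1.67, 0.4)  -> but 100x more heat per foot above
```

Both runs show about a 1.7% drop, yet the short high-current run dissipates 40 W in a single foot of wire versus 0.4 W per foot for the long run.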
Is it safe to assume that a 2% voltage drop will not create a fire hazard even for high levels of current over short lengths of wire?