I am currently studying *The Art of Electronics*, third edition, by Horowitz and Hill. The relevant passage and Exercise 1.6 read as follows:
> C. Power in resistors
>
> The power dissipated by a resistor (or any other device) is \$P = IV\$. Using Ohm’s law, you can get the equivalent forms \$P = I^2 R\$ and \$P = V^2 / R\$.
>
> Exercise 1.6. Optional exercise: New York City requires about \$10^{10}\$ watts of electrical power, at 115 volts (this is plausible: 10 million people averaging 1 kilowatt each). A heavy power cable might be an inch in diameter. Let’s calculate what will happen if we try to supply the power through a cable 1 foot in diameter made of pure copper. Its resistance is \$0.05 \ \mu \Omega\$ (\$5 \times 10^{-8}\$ ohms) per foot. Calculate (a) the power lost per foot from "\$I^2R\$ losses," ...
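Just to spell out the equivalent forms quoted above: substituting Ohm's law (\$V = IR\$, equivalently \$I = V/R\$) into \$P = IV\$ gives

\$\$P = I(IR) = I^2 R \qquad \text{and} \qquad P = \frac{V}{R}\,V = \frac{V^2}{R}.\$\$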
So we have \$P = I^2 R\$, with \$R = 5 \times 10^{-8} \ \Omega\$ per foot, which means that we first need to find the current \$I\$. I tried to apply Ohm's law: \$I = \dfrac{115 \ \text{V}}{5 \times 10^{-8} \ \Omega} = 2.3 \times 10^{9} \ \text{A}\$. Have I done this correctly? If not, why is it incorrect, and what is the correct way to do it?
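For reference, here is the arithmetic I am doing, written out as a quick Python sanity check. It simply encodes my assumption that the full 115 V appears across a single foot of cable, which may well be where I am going wrong:

```python
# Sanity check of my numbers. This assumes (perhaps incorrectly) that the
# entire 115 V supply voltage is dropped across one foot of cable.
V = 115.0          # supply voltage in volts, from the exercise
R_per_foot = 5e-8  # cable resistance in ohms per foot, from the exercise

I = V / R_per_foot               # Ohm's law under my assumption: I = V / R
P_per_foot = I**2 * R_per_foot   # the "I^2 R" loss for one foot of cable

print(f"I = {I:.3g} A")                    # roughly 2.3e9 amperes
print(f"P per foot = {P_per_foot:.3g} W")  # roughly 2.6e11 watts
```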