An opto-isolator is appropriate, but no, you don't just wire 120 V into one. The input of an opto-isolator is just an LED, or sometimes two LEDs in parallel with opposite polarity. The LED usually emits IR, so it drops around 1.2 V and can handle maybe up to a few tens of mA. The output is usually just a phototransistor that allows current to flow thru it when it receives light from the LED.
Since the power line frequency is low, you don't need fast response and can get by with relatively little forward current. Let's say 2 mA peak thru the LED is enough. You can easily find optos with a current transfer ratio (how much current the output transistor can pass divided by how much current you run thru the LED) of 1 or more. That means the output transistor, tied between ground and a 10 kΩ pullup, will produce a good enough digital signal.
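A quick sanity check of that claim, as a minimal sketch: the 10 kΩ pullup and CTR of 1 are from the reasoning above, while the 5 V logic supply is an assumed example value, not from any particular datasheet.

```python
# Check that 2 mA of LED current is enough to pull the output low.
V_SUPPLY = 5.0      # V, logic supply the pullup ties to (assumed example)
R_PULLUP = 10e3     # ohm, pullup on the phototransistor collector
I_LED_PEAK = 2e-3   # A, peak LED current chosen above
CTR = 1.0           # current transfer ratio (output current / LED current)

i_pullup = V_SUPPLY / R_PULLUP   # current the transistor must sink: 0.5 mA
i_sink = CTR * I_LED_PEAK        # current it can sink at the LED peak: 2 mA

print(f"pullup needs {i_pullup * 1e3:.1f} mA, transistor can sink {i_sink * 1e3:.1f} mA")
# 2 mA is well above 0.5 mA, so the transistor saturates and the output goes solidly low.
```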
The peak voltage of a 120 V RMS sine wave is 170 V. An 82 kΩ resistor in series with the LED will light it well enough in that case. The resistor should also be rated for at least 200 V. The LED can't handle 170 V in reverse, so you can put an ordinary diode rated for the voltage in series with it, like a 1N4004. That also cuts down on the power dissipation in the resistor, since it only conducts half the time. In this example the resistor dissipates only about 90 mW with the diode in series. The limiting factor for the resistor will therefore be its voltage standoff capability, not its power rating.
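The numbers above follow from a few lines of arithmetic; this sketch just reproduces them using the 1.2 V LED drop and 82 kΩ resistor from this answer.

```python
from math import sqrt

V_RMS = 120.0    # V, mains voltage
V_LED = 1.2      # V, approximate forward drop of the IR LED
R_SERIES = 82e3  # ohm, series resistor

v_peak = V_RMS * sqrt(2)              # about 170 V
i_peak = (v_peak - V_LED) / R_SERIES  # about 2.1 mA peak thru the LED

# Resistor dissipation, ignoring the small LED and diode drops:
p_full_wave = V_RMS ** 2 / R_SERIES   # ~176 mW if it conducted on both half cycles
p_half_wave = p_full_wave / 2         # ~88 mW with the series diode blocking one polarity

print(f"peak LED current: {i_peak * 1e3:.2f} mA")
print(f"resistor dissipation: ~{p_half_wave * 1e3:.0f} mW")
```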
There are various tricks to reduce power consumption, like using a capacitive voltage divider before the resistor. If 90 mW is OK, then I'd just use the resistor and diode.
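For a rough feel of why a capacitor helps, here is a sketch of the simplest variant of that idea, a series dropper capacitor ahead of the resistor. The 60 Hz line frequency and the ~2 mA target are assumptions carried over from above; the values are illustrative only.

```python
from math import pi

F_MAINS = 60.0    # Hz, assumed line frequency
V_RMS = 120.0     # V
I_TARGET = 2e-3   # A RMS, roughly the same LED current as before

x_c = V_RMS / I_TARGET            # ~60 kohm of reactance needed
c = 1 / (2 * pi * F_MAINS * x_c)  # ~44 nF

print(f"reactance ~{x_c / 1e3:.0f} kohm -> capacitor ~{c * 1e9:.0f} nF")
# The capacitor drops most of the voltage but dissipates essentially no real
# power.  A small series resistor is still wanted to limit inrush, and the
# cap should be an X-rated part intended for across-the-line use.
```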