You are right, circuits are seldom powered with AC. Some analog circuits may work on AC, but digital ones don't. This is due to the very nature of how digital logic has been designed and how it has proven to be efficient and easy to implement. There were systems which used AC signals to encode data on some very old machines, but they required lots of high-power components, which is not practical with current technology. An example would be old casino slot machines, which used an analog signal to make the reels turn and the coin dispenser drop coins one by one. Also, some mechanical counters used AC signal toggling to count.
Transistors as they are now are designed to be efficient with DC signals (if you consider digital data as DC; some purists would say it is actually offset AC). That is what is most power-efficient and space-efficient in integrated circuits. Also, encoding data on an AC signal (while possible) is much more complex than with a DC signal, where you can use a simple edge-detection mechanism. To do the same with AC, you need peak detectors or similar circuits, which is not practical.
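To make the edge-detection point concrete, here is a toy sketch (all signals and thresholds are hypothetical, just for illustration): on a DC-style digital line a logic "1" is a single threshold crossing, while the same "1" sent as an AC burst keeps crossing the threshold, so you first need a crude peak (envelope) detector before an edge detector gives a sensible answer.

```python
import math

def detect_edges(samples, threshold=0.5):
    """Return indices of rising edges: where the signal crosses the threshold upward."""
    return [i for i in range(1, len(samples))
            if samples[i - 1] < threshold <= samples[i]]

def peak_detect(samples, window=8):
    """Crude peak/envelope detector: max absolute value over a sliding window."""
    return [max(abs(s) for s in samples[max(0, i - window):i + 1])
            for i in range(len(samples))]

# DC-style digital signal: a clean low-to-high transition at sample 10.
dc = [0.0] * 10 + [1.0] * 10
print(detect_edges(dc))  # exactly one rising edge

# The same "1" as a burst of 50 Hz AC sampled at 1 kHz: the raw waveform
# crosses the threshold again on every cycle, so naive edge detection
# reports extra, spurious edges.
ac = [0.0] * 10 + [math.sin(2 * math.pi * 50 * t / 1000) for t in range(40)]
print(detect_edges(ac))

# The envelope recovered by the peak detector stays high for the whole
# burst, so edge detection on it again finds a single rising edge.
print(detect_edges(peak_detect(ac)))
```

This is of course nothing like a real demodulator, but it shows why the DC case needs almost no circuitry while the AC case needs an extra detection stage before you even get to the logic.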
Now if you want to understand why technology moved toward DC, consider this: turning complex circuits on and off with DC is easy, because your signal has only two states: fully on and fully off. With AC, you would have (roughly) an AC signal or no signal. Relays could be compatible with your AC signal, since they work simply off a magnetic field, but for your relay to remain firmly on you would need to rectify the signal (so it doesn't swing), thus yielding... rectified AC.
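You can simulate that "firmly on" problem numerically. In this hypothetical sketch, a relay-like threshold check on a raw AC "on" signal drops out near every zero crossing, while full-wave rectifying it (`abs`) and smoothing it with a crude low-pass filter gives a level that turns on once and stays on:

```python
import math

# An "on" command carried as 5 cycles of 50 Hz AC sampled at 1 kHz.
ac_on = [math.sin(2 * math.pi * 50 * t / 1000) for t in range(100)]

def is_on(level, threshold=0.3):
    """Relay-like check: the coil holds only while the level exceeds the threshold."""
    return level > threshold

# Raw AC: the relay releases every time the waveform swings back through zero.
raw_states = [is_on(v) for v in ac_on]
flickers = sum(1 for a, b in zip(raw_states, raw_states[1:]) if a != b)

# Rectified and smoothed: abs() then a simple exponential moving average,
# standing in for a diode bridge plus a filter capacitor.
level, smooth = 0.0, []
for v in ac_on:
    level += 0.2 * (abs(v) - level)  # crude RC-style low-pass
    smooth.append(level)
smooth_states = [is_on(v) for v in smooth]
stable_flickers = sum(1 for a, b in zip(smooth_states, smooth_states[1:]) if a != b)

print(flickers)         # many on/off transitions on the raw AC
print(stable_flickers)  # a single turn-on, then it stays on
```

Which is exactly the point: to hold anything steadily "on" from an AC source you end up rectifying and filtering it, i.e. making DC.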
In short, it is due to both historical and technological reasons.