The general idea itself is simple: code that has remained untouched for an extended period no longer compiles or runs without some effort. The question of how this happens is a little more complicated.
Given a piece of code and a fixed environment, the system's behavior will not change. This determinism appeals to our general sense of logic.
The issue is that our environments are never static. We run newer compilers and different IDEs; servers have been upgraded and OS patches applied; often even the hardware has changed. Individually, these differences seldom seem significant. They accumulate, one by one, until we realize the new environment no longer works with the old code.
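A concrete illustration of this kind of drift, using Python's 2-to-3 transition as a hypothetical example (the variable names are invented for the sketch):

```python
# Code written for Python 2 often relied on integer division:
#   average = total / count     # Python 2: 7 / 2 == 3
# Run unchanged under a Python 3 interpreter, the same expression
# yields a float instead, silently changing downstream behavior:
total, count = 7, 2
average = total / count
print(average)  # Python 3 prints 3.5, where Python 2 printed 3
# The source never changed; the interpreter underneath it did.
```

Not a single character of the program was edited, yet any code that assumed an integer result is now subtly wrong under the new interpreter.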
The crux of the matter is, of course, that this is a gradual process we do not really take note of while the code is, in some sense, out of sight. This is what leads to the feeling that the code rotted: like leftovers left unattended in the refrigerator until the whole container is spoiled.
It may be fair to say that "environment drift" is a more precise term for what happened here than "code rot", but the latter better captures how it feels to the system maintainers. Hence the popularity of the term.