Even in small projects, when you make a change, you don't recompile everything. In fact, the build is usually performed in two steps: compilation and linking. If you modify two files which don't impact anything else in the code base, then only those two files are recompiled (usually in parallel!), and the linker does what it needs to do with them.
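As a rough illustration (file names and commands are hypothetical, assuming a GCC-style toolchain driven by a build tool such as make), two translation units can be compiled separately and then linked:

```cpp
// parser.cpp -- one translation unit, compiled on its own:
//   g++ -c parser.cpp -o parser.o
int parse_count = 0;
int parse(const char* input) { ++parse_count; return input ? 0 : -1; }

// main.cpp -- another translation unit, compiled independently:
//   g++ -c main.cpp -o main.o
int parse(const char* input);  // declaration only; no need to see parser.cpp
int main() { return parse("hello"); }

// Linking combines the object files into the executable:
//   g++ parser.o main.o -o app
// If you edit only main.cpp, the build tool rebuilds only main.o;
// parser.o is reused as-is, and only the link step is repeated.
```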
Moreover, large projects are usually split into several components: shared objects (.so files) on Linux, or dynamic-link libraries (.dll files) on Windows. Each component can be compiled independently, i.e. you can change a component and recompile it without recompiling anything that depends on it (unless you changed the interface).
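A minimal sketch of such a component, with hypothetical names and commands and assuming Linux with a GCC-style toolchain:

```cpp
// geometry.h -- the component's interface; as long as this header does not
// change, code that depends on the component does not need to be recompiled.
#pragma once
double circle_area(double radius);

// geometry.cpp -- the implementation; free to change at any time.
//   g++ -fPIC -shared geometry.cpp -o libgeometry.so
#include "geometry.h"
double circle_area(double radius) { return 3.14159265358979 * radius * radius; }

// app.cpp -- a dependent component; links against the shared object:
//   g++ app.cpp -L. -lgeometry -o app
// Rebuilding libgeometry.so after an internal change does not require
// recompiling app.cpp; the updated .so is picked up when app is loaded.
#include "geometry.h"
int main() { return circle_area(2.0) > 0.0 ? 0 : 1; }
```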
Note that when working on a large code base, more often than not you don't even need most of the code. If, for instance, you need to change the internals of a class to make it faster, chances are that all you need is the class itself and its unit tests; that's all. Your iteration loop therefore remains a matter of seconds: edit, compile, ensure the tests are still green.
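For example (a hypothetical class and test, using a bare assert instead of a real test framework), the whole loop might involve nothing more than these two files:

```cpp
// counter.h -- the class whose internals you are reworking for speed
#pragma once
class Counter {
public:
    void increment() { ++value_; }    // the internals you are free to change
    int value() const { return value_; }
private:
    int value_ = 0;
};

// counter_test.cpp -- its unit tests; the only other code you need to build:
//   g++ counter_test.cpp -o counter_test && ./counter_test
#include <cassert>
#include "counter.h"
int main() {
    Counter c;
    c.increment();
    c.increment();
    assert(c.value() == 2);  // still green after the internal change
    return 0;
}
```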
Only when you think you are finished do you check that you didn't break anything in the context of integration. This involves not only compiling and linking, but also running integration tests, stress tests, load tests, and system tests. That's usually too much for a developer machine, even for a small-scale project, and is often delegated to the build servers. Essentially, you commit your changes, the build is triggered, and a lot of stuff runs on a cluster of machines. How many machines depends on the scale of the project and the budget: it may be just one or two servers, or it may be a farm of hundreds or thousands of servers running the tests in parallel.