Java
With respect to Java, your premise is completely flawed: you never have to specify that a function is virtual--all (non-private, non-static) methods are virtual by default. Java does provide `final` to specify that a method can't be overridden, which has a somewhat similar effect, but isn't really identical. To the extent that `final` means "not virtual", it's still done in the opposite direction--it's basically an "opt-out" rather than an "opt-in" situation.
C++
One of the things Bjarne has always emphasized about how to design objects is establishing and maintaining invariants. Some of those invariants have to do with values, such as "this value must be between 1 and 100" or "this value must be between 2 and 3 times that value", etc.
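For instance, a value invariant like "between 1 and 100" is typically enforced by funneling every modification through checked member functions. A minimal sketch (the class name `Gauge` and the bounds are made up for illustration):

```cpp
#include <stdexcept>

// Hypothetical class with a value invariant: percent_ must stay
// between 1 and 100. The constructor and setter are the only ways
// to modify it, so the check lives in exactly one place.
class Gauge {
public:
    explicit Gauge(int percent) { set(percent); }

    void set(int percent) {
        if (percent < 1 || percent > 100)
            throw std::invalid_argument("percent must be between 1 and 100");
        percent_ = percent;
    }

    int get() const { return percent_; }

private:
    int percent_;  // invariant: 1 <= percent_ <= 100
};
```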
It is equally important, however, to establish and maintain invariants with respect to an object's behavior. These invariants establish a framework within which specific parts can vary. Short of truly ugly programming, it allows the programmer to specify and enforce certain behaviors via the compiler's type checking mechanism.
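One common way to express that framework in code is the non-virtual interface (NVI) idiom: the public function is non-virtual and enforces the behavioral invariant, and only a private hook is virtual. A rough sketch, with all names invented for illustration:

```cpp
#include <cassert>

// Non-virtual interface (NVI) sketch: the public, non-virtual
// process() enforces the behavioral invariant; derived classes can
// customize only the private do_process() hook.
class Task {
public:
    void process(int input) {
        assert(input >= 0);            // precondition, checked once here
        int result = do_process(input);
        assert(result >= input);       // postcondition every override must meet
    }

    virtual ~Task() = default;

private:
    virtual int do_process(int input) = 0;  // the part that may vary
};

class DoublingTask : public Task {
private:
    int do_process(int input) override { return input * 2; }
};
```

Because `process` is non-virtual, no derived class can bypass the checks; the only thing open to variation is the hook the base class chose to expose.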
For example, let's consider a C++ vector. The C++ standard guarantees amortized constant-time growth (e.g., `push_back` is amortized O(1)). If we allowed a vector's growth function to be overridden, a class derived from `vector` could violate that guarantee--but since it's (hypothetically) publicly derived from `vector`, it could be used any place a `vector` is needed, including code that may depend upon that guarantee being enforced.
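To make that concrete: `std::vector` has no virtual functions, so this can't actually happen with it. The `BasicVector` and `SlowVector` below are made-up stand-ins showing what a hypothetical virtual growth policy would allow:

```cpp
#include <algorithm>
#include <cstddef>
#include <memory>

// Hypothetical container that (unlike std::vector) exposes its
// growth policy as a virtual function.
class BasicVector {
public:
    virtual ~BasicVector() = default;

    void push_back(int value) {
        if (size_ == capacity_) {
            std::size_t new_capacity = grow_capacity();
            auto new_data = std::make_unique<int[]>(new_capacity);
            std::copy(data_.get(), data_.get() + size_, new_data.get());  // O(n) copy
            data_ = std::move(new_data);
            capacity_ = new_capacity;
        }
        data_[size_++] = value;
    }

    std::size_t size() const { return size_; }

protected:
    // Doubling keeps push_back amortized O(1): the O(n) copies
    // happen only every ~n insertions.
    virtual std::size_t grow_capacity() const {
        return capacity_ == 0 ? 1 : capacity_ * 2;
    }

    std::size_t capacity_ = 0;

private:
    std::unique_ptr<int[]> data_;
    std::size_t size_ = 0;
};

// A derived class can silently break the complexity guarantee:
// growing by one element triggers an O(n) copy on every insertion,
// so n calls to push_back cost O(n^2) overall--yet a SlowVector
// still works anywhere a BasicVector& is accepted.
class SlowVector : public BasicVector {
protected:
    std::size_t grow_capacity() const override { return capacity_ + 1; }
};
```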
Making functions virtual is also oriented toward using inheritance. Bjarne has (ever so gently) pointed out for years that many programmers overuse inheritance. It's perfectly fine to design classes that don't need or use virtual functions at all.<sup>1</sup>
So no, this decision in C++ isn't entirely (or even primarily) to avoid the overhead of calling a virtual function. Mostly it's about basic design, and objects enforcing invariants. Many of those invariants are on an object's data, but some are also on an object's behavior. Given the importance of enforcing invariants, functions should be non-virtual by default, so the invariants are enforced by default, and behavior is open to modification only where intended.
1. For example, consider his 2003 interview with Bill Venners.