This seems to be a matter of terminology surrounding the term "real time".
Real-time clock
A real-time clock is a device that keeps stable, accurate (within some tolerance) time, so that the host system can use it to associate events and actions with the date and time at which they occurred.
You can think of a real-time clock as analogous to the innards of a digital watch interfaced to the computer. It has an independently powered time reference designed to be stable and reasonably accurate. Like a digital watch, it won't lose track of the current time just because the host computer was shut down. Real-time clocks have been fitted to computers mostly as a convenience so that the user doesn't have to re-enter the current time and date every time the system is started, or make frequent adjustments to compensate for drift.
The alternative to a real-time clock is to use software and internal timers driven by the system clock. Such an approach is workable (the original IBM PC worked that way), but it is not particularly stable, and it loses track of the date/time whenever the operating system is shut down, hangs, or crashes.
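For illustration, here is a minimal sketch of that approach in C, assuming a POSIX system: a periodic timer tick drives a software-maintained clock, much like the original IBM PC's BIOS tick. The seed value is hypothetical; the point is that the count must be re-entered at startup and is lost when the program (or OS) stops.

```c
/* Sketch of a software-maintained clock: a periodic SIGALRM tick
 * increments a counter; the date/time is only as good as the seed
 * entered at startup and vanishes when the process stops. */
#include <signal.h>
#include <stdio.h>
#include <sys/time.h>
#include <unistd.h>

static volatile unsigned long ticks; /* tick count since start */

static void on_tick(int sig) { (void)sig; ticks++; }

int main(void) {
    struct itimerval tv = {
        .it_interval = { .tv_sec = 0, .tv_usec = 100000 }, /* 10 Hz tick */
        .it_value    = { .tv_sec = 0, .tv_usec = 100000 },
    };
    signal(SIGALRM, on_tick);
    setitimer(ITIMER_REAL, &tv, NULL);

    unsigned long seed = 0; /* hypothetical: user-entered time at boot */
    for (;;) {
        pause(); /* sleep until the next tick arrives */
        if (ticks % 10 == 0) /* roughly once per second */
            printf("software clock: %lu s since seed\n", seed + ticks / 10);
    }
}
```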
Real-time system
When the term "real-time" is applied to a computer system or application, it describes a system that responds to real-world events within a very short, deterministic amount of time - often just a few milliseconds, sometimes less - with defined ordering of simultaneous inputs. Real-time systems are used for such things as machine control (e.g., robotics), simulations, and games. Although a real-time application may use current date and time information, an application isn't "real-time" merely because it uses the current date and time.
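As a sketch of what deterministic response looks like in code, assuming a POSIX system with clock_nanosleep() (a real deployment would also need a real-time scheduler and OS to actually guarantee the deadline), here is a periodic 1 ms control loop. Each wake-up is scheduled against an absolute deadline, so timing error does not accumulate from one iteration to the next.

```c
#define _POSIX_C_SOURCE 200809L
#include <time.h>

#define PERIOD_NS 1000000L /* 1 ms control period */

static void control_step(void) { /* placeholder for, e.g., a robot's control law */ }

int main(void) {
    struct timespec next;
    clock_gettime(CLOCK_MONOTONIC, &next);
    for (int i = 0; i < 1000; i++) {
        control_step(); /* must complete well within one period */

        /* advance the absolute deadline by exactly one period */
        next.tv_nsec += PERIOD_NS;
        if (next.tv_nsec >= 1000000000L) {
            next.tv_nsec -= 1000000000L;
            next.tv_sec += 1;
        }
        /* sleep until the absolute deadline, not for a relative delay */
        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
    }
    return 0;
}
```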
Real-time clocks vs. high-resolution timers
As stated above, the purpose of a real-time clock is to reliably keep track of the current date and time, generally only to the second; a good one has minimal drift (seconds gained or lost per day). Real-time clocks generally don't have high resolution: their base clocks run quite slowly compared to modern CPU clocks (a 32.768 kHz watch crystal is typical). This minimizes power consumption (drain on the clock's independent power source) so that the clock continues to keep time reliably while the host computer is powered off for an extended period.
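To make the one-second resolution concrete, here is a minimal sketch of reading the hardware clock directly, assuming a Linux system that exposes it as /dev/rtc (the device name varies; /dev/rtc0 is also common, and opening it may require elevated privileges). Note that the returned struct rtc_time resolves only to whole seconds.

```c
#include <fcntl.h>
#include <linux/rtc.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <unistd.h>

int main(void) {
    int fd = open("/dev/rtc", O_RDONLY);
    if (fd < 0) { perror("open /dev/rtc"); return 1; }

    struct rtc_time t;
    if (ioctl(fd, RTC_RD_TIME, &t) < 0) { perror("RTC_RD_TIME"); return 1; }

    /* rtc_time counts years from 1900 and months from 0, like struct tm */
    printf("RTC: %04d-%02d-%02d %02d:%02d:%02d\n",
           t.tm_year + 1900, t.tm_mon + 1, t.tm_mday,
           t.tm_hour, t.tm_min, t.tm_sec);
    close(fd);
    return 0;
}
```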
A high-resolution timer isn't concerned with the current date or time; its purpose is to measure time intervals at some precision, perhaps microseconds or finer. To accomplish this, it must be based on a stable, high-frequency clock - typically the computer's system clock. High-resolution timers are also not typically concerned with drift over long durations, because their usual purpose is time measurement over short durations. And they don't share the real-time clock's power-consumption concern, because they have no job to do while the host computer is powered off.
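A high-resolution timer is used like a stopwatch. A minimal sketch, assuming a POSIX system: CLOCK_MONOTONIC reports nanoseconds and is unaffected by adjustments to the wall-clock date/time, which is exactly what you want for interval measurement.

```c
#define _POSIX_C_SOURCE 200809L
#include <stdio.h>
#include <time.h>

int main(void) {
    struct timespec start, end;
    clock_gettime(CLOCK_MONOTONIC, &start);

    /* the work being timed; a busy loop stands in for real work */
    volatile long sink = 0;
    for (long i = 0; i < 10000000L; i++) sink += i;

    clock_gettime(CLOCK_MONOTONIC, &end);
    long long ns = (end.tv_sec - start.tv_sec) * 1000000000LL
                 + (end.tv_nsec - start.tv_nsec);
    printf("elapsed: %lld ns\n", ns);
    return 0;
}
```

Note the design choice of a monotonic clock rather than the wall clock: if the system's date/time were adjusted mid-measurement (say, by an NTP step), a wall-clock-based interval would be wrong, while the monotonic clock keeps counting steadily.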