Assume you bought the meter 10 years ago, or calibrated it 10 years ago. That got you one year of measurements within spec (or the meter was broken).
After 1 year and 1 day? The manufacturer makes no claim. If you want to claim a spec and can support it, go ahead. But it's on you.
IF you measure the same thing with the same meter for 10 years without re-calibrating, and then re-calibrate and measure it again, you've got a one-point study of long-term drift. Don't forget to include the long-term drift of whatever you're measuring. You can look at the data from the uncalibrated interval and draw conclusions about it, but that's your calibration over that interval, not the manufacturer's.
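If you want to see the arithmetic for that one-point study, here's a rough sketch in Python. Every number in it is made up for illustration; the point is only how the re-calibration step lets you separate meter drift from source drift, assuming both calibration errors are small compared to the drifts.

```python
# One-point study of long-term drift: one source, tracked with one meter for
# 10 years, then the meter is re-calibrated and the source measured again.
# All readings are hypothetical.

reading_initial      = 10.00005  # V, just after the original calibration
reading_before_recal = 10.00090  # V, 10 years later, meter still on the original cal
reading_after_recal  = 10.00030  # V, same day, right after re-calibration

# The before/after-re-cal readings are minutes apart, so the source hasn't moved:
# the step between them is (mostly) the meter's accumulated drift.
meter_drift = reading_before_recal - reading_after_recal

# The initial and post-re-cal readings were both taken with a freshly calibrated
# meter, so their difference is (mostly) the source's own drift over the 10 years.
source_drift = reading_after_recal - reading_initial

print(f"meter drift over 10 years:  {meter_drift * 1e6:+.0f} uV  (one data point)")
print(f"source drift over 10 years: {source_drift * 1e6:+.0f} uV")
```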
Re-calibrate the meter after 10 years and the measurements will be within spec for one year, again; drift over the 10-year period isn't an issue. If you measured with a calibrated meter 10 years ago and measure with a calibrated meter today, each measurement will be within spec, and it's LIKELY any difference between the measurements is less than the two maximum allowed errors in opposite directions. But both errors at the maximum, in opposite directions, is the worst possible case: the two readings could disagree by up to twice the one-year spec.
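Here's that worst case as a quick calculation, assuming the usual "% of reading + % of range" spec form. The percentages below are placeholders, not from any particular datasheet.

```python
# Worst-case disagreement between two in-spec measurements of the same value:
# one made 10 years ago with a then-calibrated meter, one made today with a
# freshly calibrated meter. Spec form and numbers are hypothetical.

def allowed_error(reading, rng, pct_reading, pct_range):
    """Maximum permitted error for one reading under a '% of reading + % of range' spec."""
    return reading * pct_reading / 100.0 + rng * pct_range / 100.0

reading = 10.0   # V, the value being measured
rng     = 10.0   # V, range in use
err = allowed_error(reading, rng, pct_reading=0.0040, pct_range=0.0005)

# Each reading may be off by up to +/- err. If one is high by err and the
# other is low by err, they disagree by 2 * err -- the worst case.
print(f"allowed error per reading: +/- {err * 1e6:.0f} uV")
print(f"worst-case disagreement:   {2 * err * 1e6:.0f} uV")
```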
IF you use a meter that was calibrated once and then run for 10 years without re-calibrating, measuring various values, then you've got 9 years of data from an uncalibrated meter. It may be better than random numbers. To know how much better, you need to measure references to establish accuracy now, or re-calibrate and repeat the prior measurements to characterize repeatability, allowing for source drift. Either way, accuracy in the uncalibrated interval is on your shoulders.
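Establishing accuracy "now" against a reference looks roughly like this. The reference value, its uncertainty and the meter reading are all invented; the point is that the reference's own uncertainty rides along with whatever error bound you claim.

```python
# Checking an uncalibrated meter against a known reference to bound its error today.
# All values are hypothetical.

ref_value       = 10.00000   # V, certified value of the reference
ref_uncertainty = 0.00005    # V, uncertainty of that certified value
meter_reading   = 10.00120   # V, what the meter shows today

# The meter's error is its deviation from the reference, give or take the
# reference's own uncertainty -- which is why the reference has to be better
# than what you're trying to prove.
error_low  = (meter_reading - ref_value) - ref_uncertainty
error_high = (meter_reading - ref_value) + ref_uncertainty
print(f"meter error today: between {error_low * 1e6:+.0f} uV and {error_high * 1e6:+.0f} uV")
```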
The specifications quoted are really good. If you expect to realize that performance, you have to maintain calibration. If you want a hobbyist quick check, short the two inputs together; that had better read 0.0000 volts, 0.0000 amps and 0.0000 ohms. Beyond that, you need a voltage reference, a current reference and a resistance reference. A low-drift resistor is not an unreasonable thing for a lab to have, but why not just get the meter calibrated, or learn to calibrate it yourself, at that point, before you start shopping for voltage and current standards that are 2-10 times better than the meter spec? They aren't cheap, and they have calibration requirements themselves!
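To put "2-10 times better than the meter spec" in numbers, here's a sketch using the same hypothetical spec as above; only the 2x-10x ratios come from the paragraph itself.

```python
# How tight a reference has to be for a home check to mean anything,
# at various ratios of reference uncertainty to meter spec.
# The meter spec numbers are hypothetical.

def allowed_error(reading, rng, pct_reading, pct_range):
    return reading * pct_reading / 100.0 + rng * pct_range / 100.0

meter_err = allowed_error(10.0, 10.0, pct_reading=0.0040, pct_range=0.0005)

for ratio in (2, 4, 10):
    print(f"{ratio:>2}x better than the meter: reference must hold "
          f"+/- {meter_err / ratio * 1e6:.0f} uV at 10 V")
```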