Specific to a Digital Multimeter (DMM): the type of Analog-to-Digital Converter (ADC) used inside a typical DMM is called a dual-slope integrating ADC. This technology has been around since the 1970s; take a look at the Intersil ICL7106. I previously wrote about how this device works here.
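To make the principle concrete, here is a minimal Python sketch of a dual-slope conversion (a simplified model, not the ICL7106's actual circuit): the unknown input is integrated for a fixed number of clock counts, then a reference of opposite polarity is integrated while counting until the integrator returns to zero. The final count depends only on the ratio Vin/Vref, which is why drift in the integrator's R, C, and clock frequency cancels out.

```python
# Minimal sketch of the dual-slope integrating ADC principle (a simplified
# model, not the ICL7106's exact implementation): integrate the unknown input
# for a fixed count, then de-integrate with the reference and count clock
# ticks until the integrator returns to zero.

def dual_slope_convert(v_in, v_ref, fixed_counts=1000, rc=1.0, t_clk=1e-4):
    """Return the de-integration count, which is proportional to v_in / v_ref."""
    # Phase 1: integrate v_in for a fixed time T1 = fixed_counts * t_clk.
    t1 = fixed_counts * t_clk
    v_int = (v_in / rc) * t1

    # Phase 2: apply the reference (opposite polarity) and count clock ticks
    # until the integrator output crosses zero again.
    counts = 0
    while v_int > 0:
        v_int -= (v_ref / rc) * t_clk
        counts += 1
    return counts

# With v_ref = 1.000 V and 1000 fixed counts, a 0.5 V input reads ~500 counts.
# Note the result does not depend on rc or t_clk: drift in the integrator's
# R, C, and clock frequency cancels out, which is the whole point.
print(dual_slope_convert(0.500, 1.000))   # ~500
print(dual_slope_convert(0.999, 1.000))   # ~999
```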
But to your question, which is basically how we can trust that the numbers reported by the DMM are accurate...
An instrument manufacturer such as Fluke will publish a user manual, which describes how to use the instrument and defines how accurate it can be (when properly calibrated). Separately, they also publish a service manual for use by third-party calibration service providers, detailing exactly what instruments and calibration standards are required, and exactly what procedures must be used, to achieve the designed performance of the Unit Under Test.
I can't find the URL at the moment, but here's an excerpt from a service manual that I had handy, just to show an example of the type of information provided to a company that would perform the calibration service:
(service manual excerpt image not reproduced here)
It goes on like this for quite a while, with step-by-step instructions on what plugs in where and which buttons to press, as well as specifications for letting the equipment 'soak' within a specified temperature range prior to calibration (to avoid temperature-dependent inaccuracy).
Note that in this example, even when the instrument is provided with a best-possible input reference of 30.000V, it is only expected to display a number somewhere in the range of 29.992V to 30.008V; any reported value in that range is considered close enough.
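The arithmetic behind a window like that usually comes from a spec written as ±(percentage of reading + a few counts of the least significant digit). The 0.02% and 2-count figures below are made-up values chosen to land on the same ±0.008V band, just to show the calculation; they are not Fluke's actual numbers.

```python
# Reproducing a 29.992 V .. 30.008 V window from a spec written in the usual
# "+/-(% of reading + counts)" form. The 0.02% / 2-count / 1 mV-per-count
# figures are illustrative, not taken from a real Fluke spec.

def allowed_band(reading, pct_of_reading, counts, count_value):
    half_width = reading * pct_of_reading / 100.0 + counts * count_value
    return reading - half_width, reading + half_width

lo, hi = allowed_band(30.000, pct_of_reading=0.02, counts=2, count_value=0.001)
print(f"{lo:.3f} V .. {hi:.3f} V")   # 29.992 V .. 30.008 V
```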
Each part of the instrument is calibrated in a certain order: for example, first the basic 2V measurement (offset/gain/linearity), then the 200mV and 20V ranges which depend on the 2V measurement, and only then current measurement, which depends on measuring the voltage across a known resistor. The procedure can be done manually if you have all the right gear, and if all of that gear has itself been recently calibrated so that it too is trustworthy.
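As a rough illustration of that ordering (the step names and dependencies here are invented, loosely following the sequence above), you can think of the procedure as a small dependency graph that has to be walked predecessors-first:

```python
# Sketch of why calibration order matters: each step may only run after the
# steps it depends on. Step names and dependencies are illustrative only.
from graphlib import TopologicalSorter   # Python 3.9+

cal_steps = {
    "dcv_2V_offset_gain_linearity": set(),
    "dcv_200mV_range": {"dcv_2V_offset_gain_linearity"},
    "dcv_20V_range": {"dcv_2V_offset_gain_linearity"},
    "dci_via_known_shunt": {"dcv_20V_range"},   # current = measured V / known R
}

print(list(TopologicalSorter(cal_steps).static_order()))
# e.g. ['dcv_2V_offset_gain_linearity', 'dcv_200mV_range',
#       'dcv_20V_range', 'dci_via_known_shunt']
```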
The analog semiconductor company that I work for periodically sends our lab equipment out to a third-party calibration vendor who has all this certified, calibrated standard gear and runs through all the procedures for us. It only costs money... But my own personal DMMs are 'for indication only', 'not calibrated': I don't bother sending them out, I just accept whatever level of uncertainty the user's manual says they are good for. So if my 3.3V supply measures 3.29V or 3.32V, I don't worry about it; it's within the reported tolerance and probably right on.
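For what it's worth, you can sanity-check that reasoning with the kind of spec a typical handheld meter quotes. Assuming a made-up spec of ±(0.5% of reading + 2 counts) with 10mV per count on the 20V range (not from any particular manual), both a 3.29V and a 3.32V reading are fully consistent with a true 3.300V rail:

```python
# Is a nominal 3.300 V rail consistent with what the meter displayed?
# Spec assumed here: +/-(0.5% of reading + 2 counts), 10 mV per count --
# made up for illustration, not taken from any particular DMM manual.
def consistent_with(reading, nominal, pct=0.5, counts=2, count_value=0.01):
    uncertainty = reading * pct / 100.0 + counts * count_value
    return abs(reading - nominal) <= uncertainty

print(consistent_with(3.29, 3.300))   # True
print(consistent_with(3.32, 3.300))   # True
```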
There's an important principle in Statistical Process Control: trying to make small adjustments that are smaller than the system's standard deviation will actually make it less accurate than leaving it alone... this is why target shooters and archers always first try to get a tight cluster before adjusting their aim. Same with instrument calibration. Making a small adjustment to favor the 30.000V test point will affect everything else, so they can only adjust it within a certain range before it negatively impacts the overall accuracy of the system.
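Here's a quick Monte Carlo sketch of that principle (essentially Deming's funnel experiment): if you "correct" the setting after every reading by the full observed error, the spread gets about √2 times worse than if you had left the process alone, because each correction feeds the previous reading's noise back into the process. The 0.004V noise figure below is arbitrary.

```python
# Quick Monte Carlo sketch of the "don't chase the noise" principle: adjusting
# after every reading by the full observed error roughly doubles the variance
# compared with leaving the process alone.
import random
import statistics

random.seed(1)
N = 100_000
TARGET = 30.000
SIGMA = 0.004        # process noise, well inside the +/-0.008 V window above

# Strategy A: leave it alone.
untouched = [TARGET + random.gauss(0, SIGMA) for _ in range(N)]

# Strategy B: after each reading, nudge the setting to cancel the observed error.
adjusted = []
setting = TARGET
for _ in range(N):
    reading = setting + random.gauss(0, SIGMA)
    adjusted.append(reading)
    setting -= (reading - TARGET)     # "fix" the apparent error

print("stdev, left alone:", round(statistics.stdev(untouched), 5))  # ~0.004
print("stdev, tampered  :", round(statistics.stdev(adjusted), 5))   # ~0.0057 (~sqrt(2) worse)
```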