Traditional measurement theory interprets the variance as the dispersion of a measured value, which contradicts the basic mathematical fact that the variance of a constant is 0. This paper demonstrates that the variance in measurement theory is actually an evaluation of the probability interval of an error rather than the dispersion of a measured value, identifies the key mistake in the traditional interpretation, and explains the series of changes in conceptual logic and processing methods that this new concept brings about.
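As a minimal sketch of the distinction at issue (the symbols $x_0$, $\varepsilon$ and $\sigma^2$ are introduced here for illustration and are not necessarily the paper's own notation): once a measurement is completed, the measured value is a fixed number, so its variance is zero by definition; the variance actually reported belongs to the error term.

\[
\mathrm{Var}(x_0) = 0 \quad \text{(a realized measured value is a constant)},
\]
\[
x = x_0 + \varepsilon, \qquad \sigma^2 = \mathrm{Var}(\varepsilon),
\]

so $\sigma^2$ characterizes the probability interval within which the error $\varepsilon$ is expected to lie, not the dispersion of the measured value itself.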