Accuracy is the degree to which a given quantity is correct and free from error.
For example, a quantity specified as 100+/-1 has an (absolute) accuracy of +/-1, meaning its true value lies in the range 99-101, while a quantity specified as 100+/-2% has a (relative) accuracy of +/-2%, meaning its true value lies in the range 98-102.
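The two kinds of accuracy above can be sketched as follows; this is a minimal illustration, with hypothetical helper names, of how the true-value interval is obtained in each case:

```python
def interval_abs(value, abs_acc):
    """Interval of possible true values for a quantity with
    absolute accuracy +/-abs_acc."""
    return (value - abs_acc, value + abs_acc)

def interval_rel(value, rel_acc_percent):
    """Interval of possible true values for a quantity with
    relative accuracy +/-rel_acc_percent percent."""
    delta = value * rel_acc_percent / 100
    return (value - delta, value + delta)

# 100 +/- 1 gives the interval (99, 101);
# 100 +/- 2% gives the interval (98.0, 102.0).
```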
The concepts of accuracy and precision are both closely related and often confused.
The accuracy of a number x is the number of significant decimal (or other) digits to the right of the decimal point in x, while the precision of x is the total number of significant decimal (or other) digits.
For example, 0.00123 has accuracy 5 and precision 3.
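These two counts can be computed directly from a number's decimal representation. The sketch below assumes the number is supplied as a decimal string (so that trailing zeros are preserved) and counts digits per the definitions above:

```python
def accuracy(s):
    """Digits to the right of the decimal point in the string s."""
    return len(s.split('.')[1]) if '.' in s else 0

def precision(s):
    """Total significant digits in s: drop the sign and decimal
    point, then drop non-significant leading zeros."""
    digits = s.lstrip('+-').replace('.', '').lstrip('0')
    return len(digits)

# accuracy("0.00123") is 5; precision("0.00123") is 3.
```

Passing a string rather than a float matters because float formatting can silently drop trailing zeros that were significant in the original measurement.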