Precision is the number of digits used to perform a given computation.
The concepts of accuracy and precision are closely related and often confused.
While the accuracy of a number x is the number of significant decimal (or other) digits to the right of the decimal point in x,
the precision of x is the total number of significant decimal (or other) digits.
For example, 3.14159 has an accuracy of 5 but a precision of 6.
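To make the distinction concrete, here is a minimal Python sketch that counts both quantities for a number written as a decimal string; the helper names accuracy and precision are illustrative, not drawn from any library.

    def accuracy(x: str) -> int:
        # Significant digits to the right of the decimal point.
        _, _, frac = x.partition(".")
        return len(frac)

    def precision(x: str) -> int:
        # Total significant digits: drop the point, any sign, and leading zeros.
        digits = x.replace(".", "").lstrip("+-").lstrip("0")
        return len(digits)

    print(accuracy("3.14159"), precision("3.14159"))  # prints: 5 6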
For a given numeric data type, the value of type:precision (?p) is related to the values of type:maxMantissa (?M) and type:base (?b) by the formula
?p := log(?b, ?M)
where log(a, y) denotes the base-a logarithm, i.e., log(a, y) = x iff y = a^x.
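As a sanity check of this relation, the following Python sketch recovers ?p from ?b and ?M for an IEEE 754 double (base 2, 53-digit mantissa). It assumes type:maxMantissa denotes the largest p-digit mantissa, b^p - 1, so that log(?b, ?M) falls just below ?p and rounding log(?b, ?M + 1) yields the exact integer; the function name is illustrative, not part of the ontology.

    import math

    def precision_from_max_mantissa(base: int, max_mantissa: int) -> int:
        # ?p = log(?b, ?M); with M = b**p - 1, log(base, M + 1) equals p.
        # round() absorbs any floating-point error in the logarithm.
        return round(math.log(max_mantissa + 1, base))

    # IEEE 754 double precision: base 2, 53-bit mantissa, so M = 2**53 - 1.
    print(precision_from_max_mantissa(2, 2**53 - 1))  # prints: 53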