Accuracy Versus Precision

In a recent elevator conversation with the team, we stumbled into a side discussion about the difference between accuracy and precision. I've always defined them this way:

  • Accuracy - The nearness of a value to the "real" value
  • Precision - The resolution of a measurement

It turns out this isn't entirely correct. These definitions, while technically accurate in certain fields, are not universally accepted; in fact, the above definition of precision is almost completely wrong for most other engineering and scientific disciplines. A more accurate (see what I did there?) set of definitions would be something like the following (a small numerical sketch after the list illustrates the distinction):

  • Accuracy - The nearness of a value to the "real" value
  • Precision - The probability of repeated measurement yielding the same result
  • Measurement Resolution - The resolution of a measurement
Source: Wikipedia
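
To make the distinction concrete, here is a minimal sketch in Python. The true value and the readings are invented for illustration: the hypothetical instrument has fine resolution and is precise (repeated readings cluster tightly), yet it is inaccurate because its average sits well off the real value.

```python
# Minimal sketch of the three definitions above.
# The "true" value and the readings are made up for illustration only.
from statistics import mean, stdev

true_value = 10.00  # the "real" value we are trying to measure

# Five repeated readings from a hypothetical instrument that
# reports to two decimal places (its measurement resolution).
readings = [10.31, 10.29, 10.30, 10.32, 10.30]

resolution = 0.01                              # smallest increment the instrument reports
accuracy_error = mean(readings) - true_value   # how far the average sits from the real value
precision_spread = stdev(readings)             # how tightly repeated readings cluster

print(f"Mean reading:        {mean(readings):.3f}")
print(f"Accuracy error:      {accuracy_error:+.3f}  (biased ~0.3 high -> not accurate)")
print(f"Precision (std dev): {precision_spread:.3f}  (readings cluster tightly -> precise)")
print(f"Resolution:          {resolution}   (finest step the instrument can report)")
```
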
