Artificial precision

In numerical mathematics, artificial precision is a source of error that arises when a numerical or semantic value is stored or reported with more precision than was originally provided by measurement or user input. For example, a person enters their birthday as the date 1984-01-01, but it is stored in a database as the timestamp 1984-01-01T00:00:00Z, which introduces artificial precision in the hour, minute, and second of birth and may even affect the date itself, depending on the person's actual place (and hence time zone) of birth. This is also an example of false precision, which is artificial precision specifically of numerical quantities or measures.
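A minimal sketch of the birthday example, assuming Python's standard datetime module and a hypothetical UTC−05:00 birthplace: coercing the bare date into a timestamp invents an exact instant the user never supplied, and reading that instant back in another time zone can even shift the calendar date.

```python
from datetime import date, datetime, timezone, timedelta

# Only a calendar date was provided by the user.
entered = date(1984, 1, 1)

# Storing it as a timestamp invents an exact instant (midnight UTC)
# that was never part of the input: artificial precision.
stored = datetime(entered.year, entered.month, entered.day, tzinfo=timezone.utc)
print(stored.isoformat())   # 1984-01-01T00:00:00+00:00

# Interpreting that instant in a different time zone (here a
# hypothetical UTC-05:00 birthplace) shifts the calendar date.
local = stored.astimezone(timezone(timedelta(hours=-5)))
print(local.date())         # 1983-12-31
```

One way to avoid introducing this precision is to store only what was entered, for example a date-only column rather than a timestamp, so that no fictitious time of day is ever attached to the value.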
