A global variable that controls the number of significant digits to generate when converting floating-point values to strings. It defaults to 12.

Seventeen digits is "perfect" for IEEE floating-point in that it allows double-precision values to be converted to strings and back to binary with no loss of information. However, using 17 digits suppresses all rounding, which produces longer, less intuitive results: for example, expr 1.4 returns 1.3999999999999999 when tcl_precision is 17, but 1.4 when tcl_precision is 12.

All interpreters in a process share a single tcl_precision value: changing it in one interpreter affects every other interpreter as well. Safe interpreters, however, are not allowed to modify the variable. (from: TclHelp)
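
The minimal sketch below, assuming a standard (non-safe) tclsh with IEEE double-precision floats, illustrates both the rounding behavior and the safe-interpreter restriction; the interpreter name "safe" is only illustrative.

    # With 17 digits, no rounding occurs and the underlying binary
    # approximation of 1.4 shows through in the string form.
    set tcl_precision 17
    puts [expr {1.4}]     ;# prints 1.3999999999999999

    # With 12 digits (the default), the result is rounded back to
    # the intuitive form.
    set tcl_precision 12
    puts [expr {1.4}]     ;# prints 1.4

    # A safe interpreter may not modify the variable; the write is
    # expected to fail, so we trap the error with catch.
    set safe [interp create -safe]
    if {[catch {interp eval $safe {set tcl_precision 17}} msg]} {
        puts "safe interp refused: $msg"
    }
    interp delete $safe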