There is a partial answer on Stack Overflow, but I’m asking something a teeny bit more specific than the answers there.
So… Does the formal semantics (Section 7.2) specify the meaning of such a numeric literal? Does it specify the meaning of numeric operations on the value resulting from interpreting the literal?
If yes, what are the meanings (in English — denotational semantics is all Greek characters to me :))?
Reading from the standard:
If the written representation of a number has no exactness prefix, the constant may be either inexact or exact. It is inexact if it contains a decimal point, an exponent, or a “#” character in the place of a digit, otherwise it is exact.
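For example, at a Scheme REPL (a rough sketch; the classification assumes an R5RS-style reader, where #e/#i are the explicit exactness prefixes the standard mentions):

    ; No exactness prefix and no decimal point, exponent, or "#": exact
    (exact? 100)       ; => #t
    ; A decimal point or an exponent makes the literal inexact
    (exact? 100.0)     ; => #f
    (exact? 1e2)       ; => #f
    ; An explicit exactness prefix overrides the default
    (exact? #i100)     ; => #f
    (exact? #e100.0)   ; => #t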
Basically, remember that the "#" characters amount to insignificant figures. The implementation isn't required to put any specific value in those positions, but it must put something (most use zero). It's a way of making a number imprecise.
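As a rough illustration (this assumes an implementation that still accepts the R5RS "#"-digit notation; the printed values may differ):

    ; "#" stands for an unspecified digit, and its presence makes the
    ; literal inexact. Most readers substitute 0 for each "#".
    (exact? 15##)      ; => #f
    15##               ; => 1500.0 in most implementations
    ; Inexactness then propagates through arithmetic
    (+ 15## 1)         ; => 1501.0

That last line is also the answer to the second half of the question: once a literal is read as inexact, results computed from it are inexact as well.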