# How safe is SecondsFrom1970 with Doubles?

I have found `Xojo.Core.Date.Now.SecondsFrom1970` to be really useful. If this value is assigned to a `Double`, it's easy to test for elapsed time. The values over the next 100 years don't strike me as being very large, and I'm only after millisecond precision at this stage. However, I have heard bad things about the accuracy of doubles. How does this affect the use of `SecondsFrom1970`, if at all? It looks pretty good to me, as `Double` precision is supposed to be to 15 decimal places?

SecondsFrom1970 will not have any decimals, if I understand correctly.

I doubt you will have any trouble unless you are looking at a time span in excess of 3.5 million years.

`MsgBox(Xojo.Core.Date.Now.SecondsFrom1970.ToText)` provides six decimal places, which is microseconds I believe, but on my system (Linux Web) only the three most significant decimals have non-zero values, indicating millisecond resolution. However, the integer component has 10 digits, meaning the whole number has 16 digits in total, with only 13 returning values on my system.
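As a quick cross-check outside Xojo (this assumes Xojo's `Double` is a standard IEEE 754 binary64, the same format as a Python `float`; the timestamp below is a hypothetical sample value), `math.ulp` reports the gap between adjacent representable doubles at a given magnitude:

```python
import math

# Epoch seconds today are on the order of 1.7e9 (hypothetical sample value).
now = 1.7e9

# Smallest distinguishable step between doubles at this magnitude.
gap = math.ulp(now)
print(gap)          # 2**-22, about 0.24 microseconds

# So millisecond resolution has plenty of headroom at today's epoch values.
print(gap < 1e-3)   # True
```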

I have not had any trouble in my testing at all, but since my app relies on this rather heavily, I want to be sure from a theoretical standpoint too, so I know I'm on sure ground. Where does the 3.5 million year figure come from regarding millisecond accuracy?

Doubles are accurate to 15 digits …

Is that digits or decimal places? If it's decimal places, the Xojo documentation about Equals has an example with 16 decimal places and 17 digits. On the other hand, if it's digits, there are 13 digits with values in them in what is being returned. So I think it's safe for millisecond resolution either way, but I'm after confirmation/clarification, since what is said about `Doubles` is not all that clear to me.
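For what it's worth, the "15 digits" figure means significant digits, and the precision is relative to the magnitude of the value, not a fixed number of decimal places. A sketch (assuming IEEE 754 binary64 doubles, as in Python; the timestamp is a hypothetical sample):

```python
# Near 1.0, adjacent doubles are about 2**-52 ≈ 2.2e-16 apart,
# so nearly 16 significant digits all go to the fraction.
assert 1.0 + 1e-16 == 1.0     # below half the gap: rounds away
assert 1.0 + 3e-16 != 1.0     # above it: distinguishable

# Near 1.7e9 (epoch seconds today), adjacent doubles are ~2.4e-7 apart,
# so ~10 digits go to the integer part and ~6 remain for the fraction.
t = 1.7e9
assert t + 1e-9 == t          # a nanosecond vanishes
assert t + 1e-3 != t          # a millisecond is easily distinguishable
print("millisecond steps remain exact at today's epoch values")
```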

Basically digits, as a double is represented as a mantissa plus an exponent.
See https://en.wikipedia.org/wiki/IEEE_754 and https://en.wikipedia.org/wiki/Double-precision_floating-point_format
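A sketch of what the mantissa-plus-exponent representation means numerically (Python floats are the same binary64 format; the figures follow from the 53-bit significand, but treat the year count as a back-of-envelope estimate, not an authoritative limit):

```python
import math

# 53 significand bits give log10(2**53) ≈ 15.95 decimal digits --
# the "15.95 digit" accuracy figure quoted for doubles.
print(math.log10(2.0 ** 53))                  # ~15.95

# frexp exposes the split: value = m * 2**e, with 0.5 <= m < 1.
m, e = math.frexp(1234567890.123)
print(m, e)

# Adjacent doubles near 2**e are 2**(e - 52) apart, so the spacing stays
# under a millisecond until values reach roughly 2**43 seconds:
limit_seconds = 2.0 ** 43
print(limit_seconds / (365.25 * 24 * 3600))   # hundreds of thousands of years
```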

Thanks Norman, my reading of the Wikipedia article says 64-bit Doubles have 15.95 decimal digit accuracy, so that's fine for millisecond resolution.

But I would like to see what 0.95 of a digit looks like - perhaps a few pixels off the top?