Timer not precise?

Hi all,

I’m trying to write a clock. I retrieve the atomic time from an HTTP server and then display that time. I have a timer set to go off every 1000 ms, and that increments the on-screen clock by 1 second.

Problem is, after a couple of hours, the clock is way off. I can only assume it’s because the timer is not firing every 1000 ms. Can anyone confirm this? Or suggest a better approach?

Thanks!

[quote=200446:@Peter Michael]I have a timer set to go off every 1000 ms and that increments the on-screen clock by 1 second. Problem is, after a couple of hours, the clock is way off.[/quote]

Timers are never quite precise.

You may want to set the timer to 10 ms or so and, inside it, check Microseconds to see when a new second has arrived before incrementing your on-screen clock.
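
A minimal sketch of that approach, assuming a multiple-mode Timer with Period 10 and two hypothetical Integer properties, LastSecond and DisplayedSeconds (these names are illustrative, not from the thread):

  Sub Action()
    // Whole seconds elapsed on the Microseconds clock
    Dim nowSecond As Integer = Floor(Microseconds / 1000000)
    If nowSecond <> LastSecond Then
      // A new second has arrived; advance the on-screen clock by one
      LastSecond = nowSecond
      DisplayedSeconds = DisplayedSeconds + 1
      Label1.Text = Str(DisplayedSeconds)
    End If
  End Sub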

Timers are affected by many things… system events that interrupt, even heat and cold can make them drift… they are not designed to keep pace with an atomic clock (not by a long shot).

What I would suggest is a “feedback clock”:

  1. Set the computer clock from the atomic one… by this I mean the Mac/Win computer hardware clock (which, while not as accurate as the atomic one, is less likely to be altered for software reasons… key words here: “LESS LIKELY”).
  2. Create your software clock.
  3. Every few minutes (hours?), compare the time of your software clock to the hardware clock and adjust… hence the “feedback” (see the sketch below).

I would do this to keep the software and hardware clocks in sync… it would be much easier than attempting to stay in sync with a remote atomic clock.
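
A rough sketch of step 3, assuming the software clock is kept as a Double property named SoftSeconds (seconds since midnight) and the resync runs in its own slow Timer; the names are illustrative, not from the thread:

  Sub Action()
    // Read the hardware clock, which was previously set from the atomic source
    Dim d As New Date
    Dim hardSeconds As Double = (d.Hour * 3600) + (d.Minute * 60) + d.Second
    // The feedback step: measure the drift and pull the software clock back
    Dim drift As Double = hardSeconds - SoftSeconds
    If Abs(drift) >= 1 Then
      SoftSeconds = hardSeconds
    End If
  End Sub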

No, the Timer is not precise. The Period is really “no sooner than this value, and as soon as I can after that.”

Timer firings are really “approximate.” As I understand it, a timer is essentially a method attached to the app’s event loop. On each pass through the event loop, the framework checks whether the timer is due to fire (i.e., the amount of time in the Period has passed). If so, the code in the timer is executed; if not, the timer is skipped, the app goes on with the rest of the items in the event loop, and the check repeats on the next pass.

Now, let’s say that at the 900-millisecond point in the timer’s period, your event loop sees that the timer is not ready to fire, so it passes over it and does other things. But suppose it then gets stuck doing a calculation that takes 200 milliseconds. It misses the exact moment your timer became “ready.” The next time through the loop, it sees that the timer is good to go and executes the code, roughly 100 milliseconds late.

So I see others have answered as I have been typing. Kim is correct in his definition of “period.”

How about this:

  1. Set a variable to hold the atomic time when you launch the app.
  2. Set a variable to hold Ticks when you get the atomic time.
  3. When your timer fires, have it read Ticks, subtract the value saved in step 2, and add the difference (converted to seconds) to the initial atomic time (see the sketch below).
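
A minimal sketch of those three steps, assuming two hypothetical properties set when the atomic time is fetched: AtomicBase (a Date) and TicksAtSync (an Integer). Ticks counts 60ths of a second:

  Sub Action()
    // Whole seconds elapsed since the atomic time was fetched (Ticks = 1/60 s)
    Dim elapsed As Integer = (Ticks - TicksAtSync) \ 60
    // Copy the stored atomic time and advance it by the elapsed seconds
    Dim d As New Date(AtomicBase)
    d.TotalSeconds = d.TotalSeconds + elapsed
    Label1.Text = d.LongTime
  End Sub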

Why use Ticks? Why not use Microseconds and get even more precision! :)

  • Add a StartSeconds As Double property to a module
  • In Window.Open, or whenever you reset your clock to the second:
    StartSeconds = Microseconds / 1000000
  • In a 20 ms multiple-mode timer (about 1/60th of a second):

Sub Action()
  Label1.Text = Str(Floor((Microseconds / 1000000) - StartSeconds))
End Sub

For all intents and purposes, Ticks is based on Microseconds as well, and the human eye has a latency of about 1/15th of a second. In the example I just posted, I check about every 1/60th of a second to verify that the second has elapsed for display.

That said, Dave’s feedback idea is a good one: even computers do not have extremely precise clocks. Maybe use a timer every hour or so to reset the clock from Internet time.

Come to think of it, it is possible to set a computer to get the Internet time itself, and then it becomes rather precise. Then, in a 20 ms timer Action:

Sub Action()
  Dim d As New Date
  Label1.Text = Str(d.Second)
End Sub

That is even simpler.

If the OP is trying to create a graphic analog clock display, then the processing between frames could take a major percentage of (if not multiple) ticks, and, depending on whatever else the program is meant to do, that frame rate could vary wildly. Which is why I recommend going back to the hardware clock every once in a while and adjusting (think leap second). But you will need to experiment, as you don’t want your “second hand” moving gracefully and then all of a sudden JUMPING to catch up.
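
One way to avoid that jump is to slew the displayed time toward the corrected time rather than snapping to it. A sketch under assumed names (DisplaySeconds is what the hands show, TargetSeconds is the corrected time, both hypothetical Double properties), run from the frame timer:

  Sub Action()
    // Apply a fraction of the remaining drift each frame so the hand
    // glides to the corrected time instead of jumping
    Dim drift As Double = TargetSeconds - DisplaySeconds
    If Abs(drift) > 0.001 Then
      DisplaySeconds = DisplaySeconds + (drift * 0.1)
    Else
      DisplaySeconds = TargetSeconds
    End If
    Canvas1.Invalidate // hypothetical: redraw the clock face from DisplaySeconds
  End Sub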

Thanks guys! I like the idea of using Ticks or Microseconds and will explore this. Ultimately I don’t want this program to affect the system time; otherwise, using it to set the OSX/Win clock would probably be a good approach as well.