Timer.period setting for highest resolution

I have just ported my Xojo application from OS X to Windows. It is quite a demanding application, with various services meant to run at typical video frame rates. On OS X I used a service period of 5 milliseconds. When I moved the app to Windows, everything ran slow. I did a measurement and found the service period was about 15 ms. With a little research I learned that the minimum Windows timer tick period is 15.625 ms, i.e. 1/64 second. This is fine; the application can run at that rate without issue.
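For reference, a minimal sketch of one way to take such a measurement, assuming a Double property myLastFire on the window and a Timer in multiple mode (Microseconds is Xojo's built-in microsecond counter):

```xojo
Sub Action() ' Timer.Action event handler
  Dim nowMs As Double = Microseconds / 1000.0 ' current time in milliseconds
  If myLastFire > 0.0 Then
    ' log the interval since the previous firing
    System.DebugLog("interval: " + Format(nowMs - myLastFire, "0.00") + " ms")
  End If
  myLastFire = nowMs
End Sub
```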

But the Xojo timer period property is an integer. If period=15, will this cause a timer event on every Windows timer tick?

Also: If period=16, will there periodically be a dropped Action event?

The answers depend on how the Xojo developers decided to deal with the rounding error.
TIA

Hello Tom,

There are a few points to consider on Windows, which is quite different from programming on OS X. With Xojo on OS X the timer seems to work well. On Windows the standard approach is a tight loop: a GetMessage declare for event-based applications and PeekMessage for games. The timer in Xojo on Windows is not designed for gaming, and I haven’t been able to find a way to keep the OS X mindset while programming a game on Windows (maybe someone has?).
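To illustrate the shape of that pattern, here is a rough tight-loop sketch in Xojo terms. UpdateAndDraw and quitFlag are hypothetical names, and calling App.DoEvents in a desktop app is generally discouraged; this is only a sketch of the loop structure, not a recommended implementation:

```xojo
Dim nextFrame As Double = Microseconds ' schedule kept in microseconds
Do
  If Microseconds >= nextFrame Then
    nextFrame = nextFrame + 16666.0 ' next frame due in ~1/60 s
    UpdateAndDraw ' hypothetical method doing the per-frame work
  End If
  App.DoEvents ' let pending OS messages be handled, PeekMessage-style
Loop Until quitFlag ' hypothetical Boolean property that ends the loop
```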

Another helpful point, brought up by @Norman Palardy a long time ago, is to execute the drawing commands within the Paint event, which makes the movement of objects much smoother (see the sketch below). Again, smooth movement is drawn differently on Windows than on OS X, because the time spent drawing needs to be factored in. The general philosophy is that Windows gives you all of the commands and you need to figure it out (kinda like Xojo), whereas OS X tends to package commands to make it easier for a beginner programmer to learn, and both OSes have their own benefits in different areas. Some #If TargetWindows blocks may be needed to bridge this difference in design philosophy between a Mac and Windows.
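A minimal sketch of the Paint-event pattern, assuming a Canvas named GameCanvas and hypothetical spriteX/spriteY properties on the window; the timer only requests a repaint, and all drawing happens in Paint:

```xojo
Sub Action() ' Timer.Action: no drawing here, just request a repaint
  GameCanvas.Invalidate(False) ' False = don't erase the background first
End Sub

Sub Paint(g As Graphics) ' GameCanvas.Paint: all drawing happens here
  g.ForeColor = &c000000
  g.FillOval(spriteX, spriteY, 20, 20) ' hypothetical sprite position
End Sub
```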

The timer will work well for very simple animation; with complicated graphics it will cause issues on Windows, while it typically still works well on OS X. My answer would be to try it out :slight_smile:

This setting tells the timer when to fire, which is different from how long the code has actually taken to run (the OS X timer versus Windows tight-loop difference again). On Windows you need to determine how long the drawing and update code actually took before scheduling the next frame. Xojo's timing on OS X seems to compensate for this frame timing already. My guess is that the Action event will not be dropped on OS X; I am not sure about Windows, as a timer is not the right approach for high-performance game design. Simpler games may be fine if the routine draws a smaller number of objects to the screen.
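A sketch of that compensation, assuming a Double property myLastTime on the window and a movement speed expressed per second rather than per firing:

```xojo
Sub Action() ' Timer.Action
  Dim nowSec As Double = Microseconds / 1000000.0 ' current time in seconds
  Dim delta As Double = nowSec - myLastTime ' how long the last frame really took
  myLastTime = nowSec

  Const kSpeed = 120.0 ' assumed speed in pixels per second
  spriteX = spriteX + kSpeed * delta ' move by elapsed time, not per firing
  GameCanvas.Invalidate(False)
End Sub
```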

Another helpful hint is to separate the drawing commands from the calculation commands, as the sketches above do. Keeping these two parts of the graphics code apart will help your program design when you are trying to work around the timing differences between Windows and OS X.

For additional info, the timer’s fastest possible fire rate is documented in the language reference.

For a faster timer, please use the TimerMBS class in our plugins:

https://www.monkeybreadsoftware.net/class-timermbs.shtml

See the blog entry here:

https://www.mbsplugins.de/archive/2014-08-23/New_Timer_for_Windows

Thanks for the replies.

I’m OK with the resolution of the Windows timer. I know it’s based on a 15.625 ms tick.

Perhaps I should rephrase the question; in fact, the title of the thread may be wrong. What I am wondering is how Xojo resolves the error between the period specified in Timer.period (in integral milliseconds) and the OS timer, which ticks at 15.625 ms. Clearly, if the OS could handle timing events at 1 ms resolution, there would be no problem: whatever you specify in Timer.period is presumably what you get.

But when the resolution of the Timer.period property is higher than the resolution of the base timer, you can get something akin to beat frequencies. For instance, the Windows timer runs at 64 Hz; say Timer.period=20, i.e. 50 Hz. What would you see if you laid the Action events out on a timeline? I imagine some events at 64 Hz, and then periodically an event drops out.
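To make the beat pattern concrete, here is a small simulation under one assumed policy, namely that the timer fires on the first 15.625 ms tick at or after each 20 ms deadline (whether Xojo or Windows actually schedules this way is exactly my question):

```xojo
Const kTick = 15.625 ' Windows tick period in ms
Const kPeriod = 20.0 ' requested Timer.period in ms

Dim lastFire As Double = 0.0
For n As Integer = 1 To 8
  Dim deadline As Double = n * kPeriod
  Dim fireTime As Double = Ceil(deadline / kTick) * kTick ' next tick boundary
  System.DebugLog("event " + Str(n) + " at " + Str(fireTime) + _
      " ms, interval " + Str(fireTime - lastFire) + " ms")
  lastFire = fireTime
Next
```

Under that assumption the intervals come out as 31.25, 15.625, 15.625, 31.25, … ms: mostly tick-rate events with a longer gap mixed in, averaging out near the requested 20 ms over time.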

I’m pretty sure that if Timer.period <= 15, then the Action events occur at 64 Hz, though that is a guess. I am thinking that, when confronted with that design question, the easy answer is to clamp to the minimum supported by the OS, so that is probably what they did.

I am not sure if you are thinking of timers as a kind of hardware-triggered interrupt that fires on a precise schedule.

If that should be so: Xojo timers are not. They are more or less a request to the system to fire at some point in the future, no earlier than the specified number of milliseconds from now. If the system, or your code, keeps the CPU very busy at that time, they might be delayed for quite some time.
So to answer your question: Xojo does not deal with fractional timing errors at all, because there will always be some variance in the period anyway.

BTW: This is pretty common behavior. Take a game that usually runs at 60 Hz. When there is a lot of action going on, you will see much lower frame rates, because a frame’s computation takes longer than the time slice available for it. A slowdown in game speed is avoided by taking the actual system time as the base for the computation, not some incrementing counter. The game will instead stutter a bit, because frames are being skipped this way.
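A sketch of that clock-based approach, using a fixed simulation step; gameTime, AdvanceSimulation and DrawCurrentFrame are hypothetical names:

```xojo
Const kStep = 0.0166667 ' one 60 Hz simulation step, in seconds
Dim nowSec As Double = Microseconds / 1000000.0 ' the real clock

' Catch the simulation up to the real clock. A slow frame makes this loop
' run several steps, which is exactly the frame skipping described above.
While gameTime + kStep <= nowSec ' gameTime: hypothetical Double property
  AdvanceSimulation(kStep) ' hypothetical fixed-step update method
  gameTime = gameTime + kStep
Wend
DrawCurrentFrame ' hypothetical: render whatever state the simulation reached
```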