Hey guys,
I’m working on testing one of my apps on a fairly low-end Windows 8.1 laptop, and it’s revealing how long some tasks can take. For example, I discover a number of devices using Christian’s Bonjour classes. Once a device is discovered, I start a timer; when it fires, it works out where the device goes in my listbox of devices and then starts the process of logging into the device, which in turn fires more timers to get information from the device, etc.
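Here’s a simplified version of the flow (class, property, and method names made up for illustration):

    ' Fires when Christian's Bonjour classes report a new device
    Sub DeviceDiscovered(deviceName As String)
      PendingName = deviceName ' window property holding the device to process
      ' Kick off the follow-up work via a single-shot, period-0 timer
      PlacementTimer.Period = 0
      PlacementTimer.Mode = Timer.ModeSingle
    End Sub

    ' PlacementTimer's Action event handler
    Sub PlacementTimerAction()
      Dim row As Integer = RowForDevice(PendingName) ' stand-in for my row-placement logic
      DeviceList.InsertRow(row, PendingName)
      StartLogin(PendingName) ' the login in turn fires more timers to fetch info
    End Sub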
The timers that fire when a device is discovered all have a period of 0. Yet some of these timers literally take seconds before they fire! I’ve been experimenting by checking the elapsed time in microseconds between spots in the code, and the gap between when I set the timer’s mode to single and when it actually fires is nowhere near what I expected. Granted, I have a lot going on, with a number of different timers firing and several things happening in parallel.
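Roughly how I’m measuring it (again simplified; TimerArmedAt is a Double property on the window):

    ' Just before arming the timer
    TimerArmedAt = Microseconds
    InfoTimer.Period = 0
    InfoTimer.Mode = Timer.ModeSingle

    ' First lines of InfoTimer's Action event
    Dim latency As Double = Microseconds - TimerArmedAt
    System.DebugLog("Timer fired after " + Format(latency / 1000000.0, "0.000") + " seconds")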
I tried moving some of the work that doesn’t involve the UI onto threads, but that made the delays even longer! I’m sure all of this is better on a faster machine, but I’m wondering if there’s any way to speed up the triggering of the timers. It’s almost as if the framework takes one timer, fires it, and then works through every additional timer or process spawned by that first one before moving on to the next.
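What I tried for the threading, roughly (LoginThread is a Thread subclass; ConnectToDevice and Authenticate stand in for my actual socket code):

    ' LoginThread's Run event: only the non-UI part of the login
    Sub Run()
      ConnectToDevice(DeviceAddress) ' socket work, no UI access
      Authenticate(UserName, DevicePassword)
      LoginComplete = True ' a timer on the main thread polls this and updates the UI
    End Sub

    ' Started from the discovery code instead of doing the login inline
    Dim t As New LoginThread
    t.DeviceAddress = PendingAddress
    t.Run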
I figured that each time through the event loop, any pending timers would fire one right after another…
Is what I am seeing normal?