Here's the issue. Code can be provided on request, but it's quite messy and spread across different methods:
Basically, I am using timers to run a series of picture slides (e.g., timer 1 shows pic 1 full screen, which is then replaced by pic 2 when timer 2 starts).
I am also recording when people press a button during this picture slide portion (e.g., press "h" while pic 3 is shown).
This is done with something along the lines of "starttime = milliseconds/1000 at timer 1" and "stoptime = milliseconds/1000 on keydown" (or, if I am recording the time to go from timer "x" to timer "y", "stoptime = milliseconds/1000 at timer y"), followed by recordedtime = stoptime - starttime.
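In case it helps to see the pattern concretely, here is a minimal sketch of that start/stop timing approach, written in JavaScript (my assumption; the function and variable names are hypothetical, and I'm using performance.now() in place of whatever "milliseconds" is in the original environment):

```javascript
// Hypothetical sketch of the described timing pattern.
// nowMs stands in for the environment's millisecond clock reading.
let startTime = 0;

function onTimerStart(nowMs) {
  // "starttime = milliseconds/1000 at timer 1"
  startTime = nowMs / 1000; // convert ms to seconds
}

function onKeyDown(nowMs) {
  // "stoptime = milliseconds/1000 on keydown"
  const stopTime = nowMs / 1000;
  // "recordedtime = stoptime - starttime"
  return stopTime - startTime;
}

// e.g. timer starts at the 350 ms mark, key pressed at 600 ms:
onTimerStart(350);
const recordedTime = onKeyDown(600); // 0.25 seconds
```

The key point is that both readings come from the same running clock, so the subtraction gives elapsed time regardless of when the trial started.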
The timers in this picture slide portion each run for about 100-300 milliseconds (and I am aware that they are not fully accurate; I have read the timer documentation page).
There seems to be a gap of about 50 milliseconds where, if a timer starts while a key is pressed (i.e., two actions at roughly the same time), no information is recorded.
Here is an example histogram, where the x-axis is time and the y-axis is the frequency of button presses at that time.
FYI, the button press action (i.e., keydown is enabled) starts at the same time as time recording, and timer "x" starts at the 350 mark; so from roughly 370 to 410 nothing is recorded, and any button press inputs are re-recorded after 410 (which may explain why there are so many afterwards, beyond the normal distribution curve).
If I am understanding correctly, it seems the keydown code is halted to let the timer start, and the keydown inputs are automatically resubmitted after the timer code runs.
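One possible workaround, if this is running in a browser-like environment (an assumption on my part), is to stop reading the clock inside the handler and instead use the timestamp carried by the key event itself. In the DOM, event.timeStamp is set when the event is created, not when the handler finally runs, so a press that arrives while a timer callback is busy should still be stamped at the true press time once it is processed. A sketch with hypothetical names:

```javascript
// Sketch: record reaction times from the event's own timestamp so a busy
// timer callback does not distort them. Names are hypothetical.
const presses = [];
let slideShownAt = 0;

function showSlide(nowMs) {
  // call with the current clock reading when the picture appears
  slideShownAt = nowMs;
}

function recordPress(event) {
  // event.timeStamp reflects when the key event was created, even if this
  // handler ran late because a timer callback was executing at the time
  presses.push({ key: event.key, rt: (event.timeStamp - slideShownAt) / 1000 });
}

// In a browser you would wire this up with:
//   document.addEventListener('keydown', recordPress);
// and call showSlide(performance.now()) when each picture is drawn.
```

This doesn't remove the delay in when the handler runs, but it should make the recorded times reflect the actual press rather than when the timer code finished.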
If anyone can clarify and/or help solve this issue, either by suggesting a different way to record or otherwise, I'd greatly appreciate it, fellow coders! Any and all help is welcome and appreciated!