C# Timer -- measuring time slower than expected
- by Fassenkugel
I'm writing code where:
I.)
The user adds "events" at run-time (to a FlowLayoutPanel). These events turn some LEDs on/off after "x" time has elapsed, and the LED-turning functions are written in a Led-function.cs class. Conceptually, each event is just a delay plus an action (see the sketch after the examples), e.g.:
1) Turn the left LED on after 3500 ms
2) Turn the right LED on after 4000 ms
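Something like this is what one entry boils down to; the names here are just for illustration, my real class is more involved:

using System;

// Illustration only - a trimmed-down, hypothetical version of one "event":
class LedEvent
{
    public int DelayMs { get; set; }            // e.g. 3500
    public Action<string> TurnLed { get; set; } // e.g. LedObject.LeftLED_ONnobutton
}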
II.)
When the user hits Start, a timer starts:
// Create timer.
System.Timers.Timer _timer = new System.Timers.Timer();
_timer.Interval = 1; // interval in milliseconds
_timer.Elapsed += (sender, e) => { HandleTimerElapsed(LedObject, device, _timer); };
_timer.Start();
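For context, these are (trimmed down) the fields on my form that the lambda above and the handler below use:

Led_Functions LedObject;  // the LED-turning functions from Led-function.cs
string device;            // which device the LEDs belong to
int NumberOfTicks;        // +1 on every tick event
int[] Start_time;         // the user-defined times in ms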
III.)
The timer's tick event is raised every millisecond and checks whether the user-defined time has elapsed. I'm measuring the elapsed time by adding +1 to an integer on every tick event (NumberOfTicks++;).
// Timer handler
private void HandleTimerElapsed(Led_Functions LedObject, string device, System.Timers.Timer _timer)
{
    NumberOfTicks++;
    if (NumberOfTicks >= Start_time[0])
    {
        LedObject.LeftLED_ONnobutton(device);
    }
}
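To sanity-check the timing outside of the GUI, I also put together this stripped-down console sketch of the same counting scheme. The Stopwatch is only there to measure the real elapsed time (it's not part of my actual project), and the 3000 stands in for the user-defined delay:

using System;
using System.Diagnostics;
using System.Threading;

class TimerRepro
{
    static int NumberOfTicks;

    static void Main()
    {
        const int targetTicks = 3000;       // stands in for Start_time[0]

        var stopwatch = Stopwatch.StartNew();
        var timer = new System.Timers.Timer();
        timer.Interval = 1;                 // same 1 ms interval as above
        timer.Elapsed += (sender, e) =>
        {
            // Same counting scheme as HandleTimerElapsed; Interlocked only
            // because Elapsed fires on a thread-pool thread.
            if (Interlocked.Increment(ref NumberOfTicks) == targetTicks)
            {
                stopwatch.Stop();
                Console.WriteLine("Counted " + targetTicks + " ticks in " +
                                  stopwatch.ElapsedMilliseconds + " ms");
            }
        };
        timer.Start();

        Console.ReadLine();                 // keep the console app alive
        timer.Dispose();
    }
}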
IV.) What I noticed is that when the interval was set to 1 (so the tick event is raised every millisecond), even if I set 3000 ms for the event, the LED actually flashed after around 6 seconds, i.e. roughly 2 ms of real time per counted tick.
When the interval was set to 100 (so every 0.1 s), the timing was much more accurate (3.5 s or so).
Any ideas why I'm getting this delay?
Or do you have any ideas on how I could implement it better?
Thank you!