High precision event timer
Posted by rahul jv on Stack Overflow, 2013-10-28.
#include "target.h"
#include "xcp.h"
#include "LocatedVars.h"
#include "osek.h"
/**
 * This task is activated every 10ms.
 */
long OSTICKDURATION;
TASK( Task10ms )
{
    void XCP_FN_TYPE Xcp_CmdProcessor( void );
    uint32 startTime = GetQueryPerformanceCounter();

    /* Trigger DAQ for the 10ms XCP raster. */
    if( XCPEVENT_DAQ_OVERLOAD & Xcp_DoDaqForEvent_10msRstr() )
    {
        ++numDaqOverload10ms;
    }

    /* Update those variables which are modified every 10ms. */
    counter16 += slope16;

    /* Trigger STIM for the 10ms XCP raster. */
    if( enableBypass10ms )
    {
        if( XCPEVENT_MISSING_DTO & Xcp_DoStimForEvent_10msRstr() )
        {
            ++numMissingDto10ms;
        }
    }

    duration10ms = (uint32)( ( GetQueryPerformanceCounter() - startTime ) / STOPWATCH_TICKS_PER_US );
}
What would be the easiest (and/or best) way to synchronise to an accurate clock so that a function is called at a fixed time interval, with little jitter under normal circumstances, from C++? I am working on the Windows operating system now. The code above is for an OSEK RTOS, but I want to call a function at a fixed interval on Windows. Could anyone assist me in C++?
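One possible approach on Windows (a minimal sketch, not the original poster's code) is to sleep until absolute deadlines computed from std::chrono::steady_clock, optionally lowering the system timer granularity with timeBeginPeriod. The 10 ms period, the task10ms function name, and the fixed iteration count below are illustrative assumptions.

// Periodic 10 ms loop on Windows using absolute deadlines (sketch).
// Assumes MSVC-style linking of winmm.lib for timeBeginPeriod/timeEndPeriod.
#include <chrono>
#include <thread>
#include <windows.h>
#pragma comment(lib, "winmm.lib")

void task10ms()
{
    // Work to perform every 10 ms goes here.
}

int main()
{
    using namespace std::chrono;

    timeBeginPeriod( 1 );                   // request ~1 ms scheduler granularity

    const auto period = milliseconds( 10 );
    auto next = steady_clock::now() + period;

    for( int i = 0; i < 1000; ++i )         // run for roughly 10 s in this sketch
    {
        task10ms();

        // Sleep until the absolute deadline rather than for a relative delay,
        // so a late wake-up shortens the next sleep instead of accumulating drift.
        std::this_thread::sleep_until( next );
        next += period;
    }

    timeEndPeriod( 1 );
    return 0;
}

Using sleep_until with an absolute deadline keeps the long-term rate close to 100 Hz even when individual iterations run long or wake slightly late; per-wakeup jitter is still limited by the Windows scheduler, so hard real-time guarantees comparable to an OSEK task are not achievable this way.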