I’m trying to compensate for the clock drift between two PCs (VxWorks and QNX) with ClockAdjust(). On the QNX side I use a timer to make the system do something every 4 ms. Those 4 ms have to match exactly the 4 ms I measure on the VxWorks side, so in my system I adjust the clock under QNX with ClockAdjust().
What I experienced is that ClockAdjust() seems to have no effect on the time. I wrote a small test program (without the VxWorks side) that gets a timer event every 0.5 s. After each timer event I adjust the clock over 2000 ticks by 12500 ns per tick (CLOCK_RESOLUTION = 125000 ns). So in the end I get the expected 525000000 ns as the interval on the console. But when I measure the real time by toggling the parallel port and watching it on an oscilloscope, I get exactly 500 ms between timer events.
What exactly happens when I create a timer, and when does it expire? From what I am seeing, I guess that on creation the number of ticks until expiration is calculated, and then a counter is incremented every tick until that number is reached. If that is so, how can I adjust the time in a way that also affects the timer? What I want is 475 ms of real time between timer events.