I read these postings with some alarm. Changing the meaning of
filltime could have serious implications for some systems (like ours).
I appreciate that every system has its own priorities. It is just
that filltime was defined as system time a long time ago and so
carries some precedent.
Our system contains several processor boards (11 to be exact) of which
seven are connected to IEEE 1394 cameras. All processors (and the
processes they run) are tightly synchronized via NTP and custom code
which takes care of some of NTP's idiosyncrasies. All events, such
as frame times, are referenced to absolute time as defined by a master
clock. What you are proposing would, in effect, disable the ability to
shift filltimes by adjusting system time.
If dc1394 reports filltimes that are anything but gettimeofday (or the
equivalent kernel call), our system will fail because it uses and
adjusts system time on each processor board to maintain
synchronization with other boards and the external world. I am sure
we are not the only ones in this situation.
I am not against changing anything for the better, including the quite
sweeping changes to the API since version 1.0. However, tinkering
with underlying semantics can break working systems without warning.
I propose that if inter-frame timing is more important to you than
absolute frame times, you either disable services like NTP while you
are capturing frames, or keep track of system time adjustments and
apply offsets to filltime as needed (see the sketch below).
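To make that second option concrete, here is a minimal sketch of the
kind of compensation an application could do on its own, leaving
filltime itself as system time. It assumes POSIX clock_gettime with
CLOCK_MONOTONIC as the drift-free reference (on OS X, Microseconds or
mach_absolute_time would play that role); correct_filltime and its
baseline argument are names I made up for illustration:

    #include <stdint.h>
    #include <sys/time.h>
    #include <time.h>

    /* Wall clock minus monotonic clock, in microseconds.  A step in
       this value means system time was adjusted. */
    static int64_t realtime_minus_mono(void)
    {
        struct timespec rt, mono;
        clock_gettime(CLOCK_REALTIME, &rt);
        clock_gettime(CLOCK_MONOTONIC, &mono);
        return (int64_t)(rt.tv_sec - mono.tv_sec) * 1000000
             + (rt.tv_nsec - mono.tv_nsec) / 1000;
    }

    /* Record baseline = realtime_minus_mono() once when capture
       starts, then call this on every frame to cancel any adjustments
       made to system time since then. */
    void correct_filltime(struct timeval *filltime, int64_t baseline)
    {
        int64_t shift = realtime_minus_mono() - baseline;
        int64_t us = (int64_t)filltime->tv_sec * 1000000
                   + filltime->tv_usec - shift;
        filltime->tv_sec  = us / 1000000;
        filltime->tv_usec = us % 1000000;
    }

That keeps dc1394's semantics intact for those of us who depend on
them, while giving continuous timings to those who want them.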
David Moore wrote:
> I think that's a good idea in principle. However, libdc1394 has the
> semantics that the timestamp should have the value that gettimeofday()
> would, for the sake of computing the absolute time of a frame.
> I think a reasonable compromise is this:
> During capture_setup, compute the offset between Microseconds() and
> gettimeofday(). Then, for frame timestamps, use the Microseconds()
> function and apply the fixed offset. That way, we get the best of both
> worlds: The timestamps have absolute meaning, except in the case where
> the user's clock is changed during capture, which is a rare event. In
> that case, continuity takes precedence.
> How does that sound?
> On Sun, 2007-01-14 at 10:17 +0100, Mark Munte wrote:
>>I see that frame timestamps are calculated using gettimeofday (OS X).
>>If the user or system adjusts the time, the changes affect the frame
>>timestamps. For me this does not work well: I need frame timings
>>relative to each other, not dependent on settings changeable by the
>>user. I would like to suggest using Microseconds (or equivalent)
>>instead of gettimeofday. Would that work for everyone?
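For what it is worth, the fixed-offset compromise David describes
above would look roughly like this. All of the names are illustrative
rather than actual libdc1394 entry points, and mono_us() stands in for
Microseconds() or any other monotonic microsecond counter:

    #include <stdint.h>
    #include <sys/time.h>

    extern int64_t mono_us(void); /* monotonic source, e.g. Microseconds() */

    static int64_t ts_offset;     /* fixed once, at capture setup */

    static int64_t wall_us(void)
    {
        struct timeval tv;
        gettimeofday(&tv, NULL);
        return (int64_t)tv.tv_sec * 1000000 + tv.tv_usec;
    }

    void on_capture_setup(void)
    {
        /* Record the wall-clock/monotonic offset once. */
        ts_offset = wall_us() - mono_us();
    }

    int64_t frame_timestamp_us(void)
    {
        /* Looks like gettimeofday, but ignores any clock change
           made after setup. */
        return mono_us() + ts_offset;
    }

My point above is exactly that the "rare event" of the clock changing
during capture is, for systems like ours, a routine and deliberate one.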