I've successfully updated my app to the new 2.0.1 API -
overall libdc is working great.
My app was crashing when resuming from sleep; I got around that by
freeing the dc1394_t structure before sleep and re-creating it on
wake-up. Maybe this should go into the docs.
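For reference, the workaround looks roughly like this. This is only a
sketch: the on_system_will_sleep / on_system_did_wake hooks are
hypothetical names for whatever power-notification callbacks the app
registers (on OS X, e.g. via IORegisterForSystemPower), and the GUID
handling is simplified.

```c
#include <dc1394/dc1394.h>

/* Global libdc1394 state owned by the app (simplified). */
static dc1394_t       *ctx    = NULL;
static dc1394camera_t *camera = NULL;

/* Called from the (hypothetical) power-notification callback
 * just before the machine goes to sleep: tear everything down
 * to avoid the crash on resume. */
void on_system_will_sleep(void)
{
    if (camera) { dc1394_camera_free(camera); camera = NULL; }
    if (ctx)    { dc1394_free(ctx);           ctx    = NULL; }
}

/* Called after wake-up: re-create the context and re-open the
 * camera from scratch using its GUID. */
void on_system_did_wake(uint64_t guid)
{
    ctx = dc1394_new();
    if (ctx)
        camera = dc1394_camera_new(ctx, guid);
}
```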
When setting a large number of DMA buffers (>15) I sometimes get an
error at capture start, but only when capturing HD video with an AVT
Pike; with the AVT Marlin at SD resolution I can set up 50 DMA buffers
without any problems. Does anyone know why that happens?
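For concreteness, this is the call that fails (a sketch; NUM_BUFFERS
is just a placeholder for the buffer count). One thing worth noting is
that the total DMA memory requested scales with frame size times
buffer count, so the same buffer count asks for much more memory at HD
than at SD - that may or may not be related.

```c
#include <dc1394/dc1394.h>
#include <stdio.h>

#define NUM_BUFFERS 50  /* >15 is where the HD failure shows up */

/* Sketch: requesting NUM_BUFFERS DMA buffers at capture setup.
 * 'camera' is assumed to be an already-opened dc1394camera_t*. */
int setup_capture(dc1394camera_t *camera)
{
    dc1394error_t err = dc1394_capture_setup(camera, NUM_BUFFERS,
                                             DC1394_CAPTURE_FLAGS_DEFAULT);
    if (err != DC1394_SUCCESS) {
        fprintf(stderr, "capture setup failed: %s\n",
                dc1394_error_get_string(err));
        return -1;
    }
    return 0;
}
```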
About threading: I noted that with the new API libdc no longer
requires dc1394_capture_setup and dc1394_capture_stop to be called
from the same thread as dc1394_capture_dequeue. But I also noted that
calling them from different threads makes my app use about twice the
CPU time (Shark shows most of the time is spent in ml_set_interrupts).
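For context, the pattern I'm using is roughly the following (a
simplified sketch, not my exact code; shutdown signalling and error
handling are omitted):

```c
#include <dc1394/dc1394.h>
#include <pthread.h>

/* Sketch: the dequeue/enqueue loop runs on its own thread, while
 * dc1394_capture_setup/_stop and camera commands happen on the
 * thread that opened the camera. */
static void *capture_thread(void *arg)
{
    dc1394camera_t *camera = arg;
    dc1394video_frame_t *frame;

    for (;;) {
        /* POLICY_WAIT blocks until a frame is available. */
        if (dc1394_capture_dequeue(camera, DC1394_CAPTURE_POLICY_WAIT,
                                   &frame) != DC1394_SUCCESS)
            break;

        /* ... process frame->image here ... */

        /* Hand the buffer back to the DMA ring. */
        dc1394_capture_enqueue(camera, frame);
    }
    return NULL;
}
```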
I would like to understand how libdc works on OS X.
Does it create a separate thread for the IOKit stuff?
Is it safe to run dc1394_capture_dequeue on a different thread than
the main thread, while still sending commands to the camera from the
main GUI thread?