From: Peter T. <pet...@gm...> - 2021-05-12 19:45:14
I realize Manu didn't directly address my question about linking in an inference engine (via DeepStream, GStreamer, OpenCV, CUDA, TFLite...), but he did inspire me to create a post-hook. I set up a TensorFlow process on my Xavier running SSD MobileNet; when an image is captured into the motion capture directory I process it, and if there are any high-confidence class-match ROIs I move that image to another folder. This cuts back on the false positives, but the MobileNet version in the detector misses quite a few true positives that aren't people (e.g., birds, bats, cats, etc.). Not the same as native machine learning in the same application, but it gets me close enough to my goal, so it was helpful... Now I just need a better model...

Cheers,
P

On Mon, May 3, 2021 at 6:01 PM Tony <yno...@ho...> wrote:
> What point are you trying to make please?
>
> On May 3, 2021 12:09, "manu.kemppainen--- via Motion-user" <
> mot...@li...> wrote:
>
> I have combined motion with Darknet/YOLOv3/CUDA scripting (Python). I let
> motion filter images from the cameras (this decreases the number of
> inferences and keeps PC temperature / power consumption reasonable).
>
> On 3.5.2021 21.32, Peter Torelli <pet...@gm...> wrote:
>
> Greetings,
>
> Has anyone attempted to replace motion's motion detection with an
> off-the-shelf MobileNet/SSD-MobileNet/YOLO/ResNet? Googling anything with
> "motion" is quite tricky. :)
>
> As a follow-on: if so, has anyone tried to use an AI accelerator platform
> like Google's TPU stick, or NVIDIA's Xavier or Nano? (Regarding the latter,
> I've found doing anything outside of their CUDA gstreamer plugin is
> maddeningly difficult due to obscure documentation on sharing NVMM buffer
> pads.)
>
> Cheers,
> Peter
>
> _______________________________________________
> Motion-user mailing list
> Mot...@li...
> https://lists.sourceforge.net/lists/listinfo/motion-user
> https://motion-project.github.io/
>
> Unsubscribe: https://lists.sourceforge.net/lists/options/motion-user
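
[For readers wanting to try the same trick, a minimal sketch of the post-hook flow Peter describes: check a captured image's detections and move it out of the capture directory only when there is a high-confidence class match. The label set, score threshold, and function names are all illustrative, and the actual SSD MobileNet invocation is left as a pluggable `detect` callable, since the thread doesn't show it.]

```python
import shutil
from pathlib import Path

# Hypothetical label set and confidence cutoff -- the thread only says
# "high-confidence class match ROI", so tune these to your own model.
KEEP_CLASSES = {"person", "bird", "cat"}
SCORE_THRESHOLD = 0.6


def should_keep(detections, keep_classes=KEEP_CLASSES,
                threshold=SCORE_THRESHOLD):
    """Return True if any detection is a high-confidence match.

    `detections` is a list of (label, score) pairs, as produced by the
    post-processing step of an SSD MobileNet model (invocation not shown).
    """
    return any(label in keep_classes and score >= threshold
               for label, score in detections)


def filter_capture(image_path, dest_dir, detect):
    """Post-hook body: run `detect` on a freshly captured image and move
    it to `dest_dir` when it contains an interesting object; otherwise
    leave it behind as a probable false positive."""
    detections = detect(image_path)
    if should_keep(detections):
        dest = Path(dest_dir) / Path(image_path).name
        shutil.move(str(image_path), str(dest))
        return True
    return False
```

One way to wire this up is via motion's `on_picture_save` hook, which passes the saved filename as `%f`; a small script could then call `filter_capture` on each new image.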