From: Joerg W. <mo...@al...> - 2007-10-15 21:20:44
Yan, in your situation it sounds reasonable to do it exactly the client/server way. MJPEG sounds OK, but I'm not an expert. Look at webcam.c, function 'webcam_put': Motion offers an MJPEG stream on the built-in webserver, and this function should help you build your own server on the ARM device.

Motion's main purpose is surveillance, so the functionality is optimized for that task. When motion is detected, an event begins. A new movie is started at the start of an event, and the movie ends 'gap' seconds after the last motion was detected. If the reptiles only move occasionally and then sit there for a while, you may want to choose a large value for 'gap' in order not to get too many small videos. To make the movies smoother, configure some amount of 'post_capture' and, if you have enough RAM, also some frames of 'pre_capture'. Just play a bit with the options and come back when you need more explanation on specific options.

Good luck and best regards,
Joerg

On Monday, 15 October 2007 22:52, Yan Seiner wrote:
> Joerg WEBER wrote:
> > Hey Yan,
> >
> > the easiest way is to have motion pick up the pictures directly from the
> > webcam instead of writing pictures and then process them offline.
>
> Is there any way to get the pictures out of the cam via http?
>
> > If you could give us some more information about the camera that you are
> > using, we can probably come up with a more precise suggestion.
>
> Great, thanks.
>
> I'm not really set on any particular method, but I have the following
> limitations:
>
> Platform: TS7200, 200 MHz ARM board, 32 MB RAM
> 2 USB 2.0 ports, with max rate of 12 Mbit/sec
>
> Webcams: 2 ea. Logitech QuickCams, modified for near IR capability
> (basically I broke out the IR filter, easy to do with the Logitech
> QuickCam)
>
> The problem is that the webcams are supported by the spca5xx driver,
> which can't use mmap'ed access. (I haven't traced out all the code, but
> that's what the author says, and if I use it, the machine hard locks.)
> This is apparently a limitation of the ARM platform with that driver.
>
> I don't have a lot of horsepower on the ARM platform. I'd like the
> images to be 640x480 so that people can see some detail.
>
> I have a pretty powerful backend, so I can do lots of postprocessing.
>
> On reading the docs, I guess I could set up a webcam server to stream
> images to motion, which could run on the backend. It would be easiest
> to simply stream the images, one camera on each port. What do I need to
> do to create an MJPEG stream? Do I just send the images, one after the
> other, in a continuous stream? Or is there some sort of divider between
> them?
>
> Then the only issue would be how to build the movie so that users can
> get 24 hours, or maybe the last 30 minutes of activity, or something
> similar. (We're talking reptiles, so I'd guess that 30 minutes of
> activity is about what they do during a day....)
>
> Thanks,
>
> --Yan
> --
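Joerg's event-tuning advice maps onto a handful of motion.conf options. A sketch of such a fragment is below; the values are starting points to experiment with, not recommendations, and should be checked against the option reference for the Motion version in use:

```
# End the event (and hence the movie) 300 s after the last detected
# motion, so slow-moving reptiles don't produce many tiny clips.
gap 300

# Keep recording 30 extra frames after motion stops, for smoother endings.
post_capture 30

# Buffer 10 frames from before motion started (costs RAM, since frames
# must be held in memory until motion triggers).
pre_capture 10
```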
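On the "divider" question: an MJPEG-over-HTTP stream, as served by Motion's built-in webserver, is a multipart/x-mixed-replace response in which each JPEG is one part, and the boundary line between parts is the divider between frames. A minimal sketch of that framing in C follows; it is modelled loosely on what webcam_put does, and the boundary string "myboundary" is illustrative (any token works as long as it matches the one announced in the Content-Type header):

```c
#include <stdio.h>   /* snprintf */
#include <stddef.h>  /* size_t */

/* Header sent once, when the HTTP client connects.  It announces the
 * boundary token that will separate the JPEG frames. */
static const char *stream_header(void)
{
    return "HTTP/1.0 200 OK\r\n"
           "Content-Type: multipart/x-mixed-replace; boundary=myboundary\r\n"
           "\r\n";
}

/* Per-frame header written into buf; the raw JPEG bytes follow it on
 * the wire, then a trailing "\r\n".  Returns the header length. */
static int format_frame_header(char *buf, size_t bufsize, size_t jpeg_len)
{
    return snprintf(buf, bufsize,
                    "--myboundary\r\n"
                    "Content-Type: image/jpeg\r\n"
                    "Content-Length: %zu\r\n"
                    "\r\n", jpeg_len);
}
```

The serving loop would then write stream_header() once per connection, and for each captured frame write the frame header, the JPEG bytes, and a closing "\r\n" — so the images are indeed sent one after the other, but each one is preceded by the boundary line and its own headers.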