From: C.T. <sem...@ya...> - 2013-02-02 07:23:43
Hi, I used a similar setup at first with wput, but there were two major issues. First, it filled up the server drive with hundreds of images, and the server was then unable to display them at retrieval time, since it's limited to 2000 files at a time (it didn't keep track of them in a database). Second, it had a very strong tendency to clog the router and crash it, because when called from Motion, wput didn't close upload slots once used; 4000 connections is more than the router can handle. I'll have a look at mput, and I had already thought about using rsync for efficiency, if it's allowed on the server.

Your setup seems to be based on LAN speeds, but mine would use only Internet speeds, so the most serious limitation is bandwidth (less than a Mbps up). My goal is to eliminate the hundreds of generated files and replace them with one bigger file (ideally, hundreds of image captures would be replaced by a full-motion movie once fully uploaded or ready for play). Currently I have no talent whatsoever for scripting anything more than a few lines (and certainly not a buffer), so I'm still unsure how to deal with it.

Even if I solve the router stability issue, how do I make the retrieval process straightforward, meaning at least getting a single view of events, much like ZoneMinder does, but actually *working*? This still wouldn't solve the time-lapse recording issue, as even an hour-long block in h264 takes a non-negligible time to upload. Hence my streaming-recording question.

Sent from my iPad

On 2013-02-01, at 17:41, Bob Bob <bob...@bi...> wrote:

> Don't know how useful this idea would be to you, but I have internal and external (house) cameras. The external ones record on the servers inside the house, but if the internal ones ever record, I scp each image using:
>
>   on_picture_save scp -P 4622 %f x.x.x.x:images
>
> This is a friend's server I have a two-way ssh agreement with.
>
> Each of my images is roughly 300-400k, so I could get into upload congestion. It will of course queue up the job and keep trying to send, right up to the point where the intruder steals the computer!
>
> There are of course better ways of doing this. A single-stream fifo arrangement would be better than multiple scp sessions.
>
> In another vein, I also collaborate with another motion system in the same apartment block. We only send hourly avi movies to each other (from each other's cameras) and share the live feeds. Theoretically I could run rsync --update perpetually if I wanted to mirror the captured-images directory.
>
> Re the rest of your question on bandwidth/space: you can launch a shell script (from on_picture_save) to do whatever you like. An ftp session to a remote server, for example, could be launched so as to create only one session but send all the image files in a particular folder. Keeping track of the amount of space in use on the remote server is pretty easy too, from what you have uploaded. Buffering also gets handled by this external script by virtue of its fifo operation.
>
> Also worth noting that the standard ftp client can do local as well as remote operations, i.e. you can keep the single session open, move image files in and out of subfolders, use mput etc. to do mass transfers. This resolves the problem of multiple concurrent ftp connections to the distant server being refused. You just redirect stdin from the command list you want to run perpetually from the on_picture_save script.
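>
> Something along these lines is what I mean (an untested sketch of the single-session, whole-folder approach; the host name, login, spool path and the sent/ subfolder are just placeholders you would substitute for your own):
>
>   #!/bin/sh
>   # Batch uploader meant to be launched from on_picture_save (or cron):
>   # one ftp session per run, every queued image sent over that session.
>   SPOOL=/var/spool/motion-upload     # placeholder: Motion's target_dir
>   LOCK=/tmp/motion-ftp.lock
>
>   # Don't open a second session if one is already running.
>   mkdir "$LOCK" 2>/dev/null || exit 0
>   trap 'rmdir "$LOCK"' EXIT
>
>   cd "$SPOOL" || exit 1
>   mkdir -p sent
>
>   # Snapshot the queue so images arriving mid-transfer wait for the next run.
>   BATCH=$(ls *.jpg 2>/dev/null) || exit 0
>
>   # Single login, single connection, one put per queued file.
>   {
>       printf 'user myftpuser mypassword\n'   # better kept in ~/.netrc
>       printf 'binary\ncd images\n'
>       for f in $BATCH; do
>           printf 'put %s\n' "$f"
>       done
>       printf 'bye\n'
>   } | ftp -n ftp.example.com
>
>   # Crude local bookkeeping: keep what was just sent out of the next batch.
>   for f in $BATCH; do
>       mv "$f" sent/
>   done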
>
> Bob
>
> On 02/02/13 03:53, C.T. wrote:
>> Hello there,
>>
>> Can someone tell me if there is a way for Motion to record on the fly to a distant server, keeping track of it in a distant database? The rationale is that, on the one hand, a surveillance device works against intruders; as such, it can easily be stolen itself. On the other hand, uploading a recording takes time, and time can be the difference between identifying the intruder and only getting a shadow in the picture.
>>
>> To overcome this upload bandwidth limitation, especially when running 24/7 timelapse, uploading the same stream being recorded to a distant server would reduce this bottleneck to zero. The distant server would be shared hosting, allowing FTP access and Web hosting, but no shell or chroot, so no additional software can be installed on the server.
>>
>> Is there a way to do this within Motion, using a buffer in case upload bandwidth gets reduced, and automatically deleting the oldest recordings when storage space runs out on the server side?
>>
>> P