From: David P. <da...@de...> - 2013-06-20 00:16:34
This is almost identical to my system, except I am using two PogoPlugs
instead of Raspberries. I have MySQL and Apache running on one, and
Motion on the other. It handles 7 webcams very well, along with several
smaller cameras plugged into an old desktop PC. The first Pogo also runs
some custom Java code that manages the movies Motion captures, as well
as rsync, which syncs the files between the Pogo and the desktop (which
is located in a different building).
I don't guess I've ever measured the FPS rates, but it's plenty for my
use. Motion has served me very well over the years.
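For what it's worth, the rsync side is just a cron entry along these
lines (the paths here are only illustrative, not my actual layout):

    # every 15 minutes, mirror new Motion movies from the Pogo to the desktop
    */15 * * * *  rsync -az pogo:/srv/motion/ /data/motion/
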
David
On 06/19/2013 09:05 AM, Bruce W. Bodnyk wrote:
>
> Hi!
>
> I attempted to post this previously, but it had a number of screen shots
> as attachments which made the email too large for the mailing list.
> I'm re-posting it without the screen shots!
>
> ----------------------------------------------------------------------
>
>
> I’ve been developing my own home surveillance system around Motion and
> am posting this to demonstrate what I’ve done and hopefully see what
> others may be doing.
>
> Requirements
>
> The following are the “requirements” I set out to satisfy:
>
> ·Capture “vehicle” and “person” Motion events. Detect when vehicles
> pass by and when people come and go from my house.
>
> ·Use out-of-the-box Motion functionality – I don’t really want to mess
> with the Motion code.
>
> ·Scalable – Ability to have as many cameras as you want.
>
> ·Web Interface – Access the Motion data with a web browser and,
> eventually, with a smartphone app.
>
> ·Automatic event characterization – On a bright windy day I can easily
> get 500-600 events. I've got better things to do with my time than
> manually go through each one.
>
> Hardware
>
> Currently I'm using two Raspberry Pi computers: one running Motion and
> a second running the MySQL and Apache servers. On the Motion pi, I have
> two Playstation Eye cameras hooked up through a powered USB hub. Both
> pis use a hard-wired Ethernet connection rather than wireless. The
> jpegs and avis are stored on a networked drive mounted on each pi. Of
> the two cameras, I'm only detecting motion on one; with the other I'm
> just generating a timelapse mpeg. Both cameras are set to 320x240
> resolution and I am capturing at 10 fps.
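>
> For anyone wanting to reproduce a setup like this, the relevant Motion
> settings look roughly like the following. Treat it as a sketch only:
> the device paths, thread file names and the camera_2 directory are
> placeholders rather than my exact config.
>
>     # motion.conf (global)
>     framerate 10
>     width 320
>     height 240
>     # one thread (camera) file per camera
>     thread /etc/motion/thread1.conf
>     thread /etc/motion/thread2.conf
>
>     # thread1.conf - the camera I detect motion on
>     videodevice /dev/video0
>     target_dir /mnt/mybookworld/motion/raw/camera_1
>     output_normal on
>     ffmpeg_cap_new on
>
>     # thread2.conf - timelapse only, no motion output
>     videodevice /dev/video1
>     target_dir /mnt/mybookworld/motion/raw/camera_2
>     output_normal off
>     ffmpeg_cap_new off
>     # write a timelapse frame every 60 seconds
>     ffmpeg_timelapse 60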
>
> Environment
>
> Currently my one camera is pointing out my front window. I've defined
> three regions of interest: the street in front of my house, the steps
> leading from the street down to my walkway, and the walkway into my
> house. I have a colored mask (green, yellow, red and black), similar to
> the black & white mask Motion uses, which easily lets me know which
> region the motion center for a frame is in: 1 (walkway), 2 (steps) or
> 3 (street).
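>
> Determining the region is just a matter of reading the mask pixel under
> the motion center. A simplified sketch of the idea, assuming the Perl
> GD module, a png version of the colored mask and my color-to-region
> assignment; my actual script differs in the details:
>
>     use GD;
>
>     # Map a motion center (x, y) to a region by sampling the colored mask.
>     # Assumed color-to-region assignment: green = 1 (walkway),
>     # yellow = 2 (steps), red = 3 (street), black = outside all regions.
>     sub region_for_center {
>         my ($mask_file, $x, $y) = @_;
>         my $mask = GD::Image->newFromPng($mask_file)
>             or die "can't read $mask_file";
>         my ($r, $g, $b) = $mask->rgb($mask->getPixel($x, $y));
>         return 2 if $r > 128 && $g > 128;    # yellow -> steps
>         return 3 if $r > 128;                # red    -> street
>         return 1 if $g > 128;                # green  -> walkway
>         return 0;                            # black  -> no region
>     }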
>
> Detection
>
> I have Motion configured with a black & white mask, although I do have
> three different masks I use depending upon how windy it is outside.
> Currently I automatically switch between masks; eventually I will also
> re-configure Motion by adjusting the "threshold" and other settings
> depending upon how windy it is.
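>
> In case anyone wants to do something similar, one simple way to switch
> masks is to copy the mask you want over the file named by mask_file and
> send Motion a SIGHUP so it re-reads its config. A stripped-down sketch,
> with made-up paths and wind thresholds (my own setup differs in the
> details):
>
>     use File::Copy qw(copy);
>
>     # pick a mask based on the current wind speed (obtained elsewhere)
>     my $wind_mph = shift // 0;
>     my $mask = $wind_mph > 15 ? 'mask_windy.pgm'
>              : $wind_mph > 5  ? 'mask_breezy.pgm'
>              :                  'mask_calm.pgm';
>
>     copy("/etc/motion/masks/$mask", "/etc/motion/mask.pgm")
>         or die "mask copy failed: $!";
>
>     # tell Motion to re-read its configuration (and thus the mask)
>     open my $fh, '<', '/var/run/motion/motion.pid' or die $!;
>     chomp(my $pid = <$fh>);
>     kill 'HUP', $pid;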
>
> Event Process
>
> The following are the steps a motion event goes through.
>
> 1. When motion is detected I generate both an avi (ffmpeg_cap_new on)
> and jpegs (output_normal on). I also have Motion configured to submit
> an sql query that adds the event information to a "raw_events" table,
> i.e.:
>
> insert into raw_events(camera, filename, frame, file_type, time_stamp,
> event_time_stamp, event_class, event_type, changed_pixels, noise_level,
> motion_width, motion_height, motion_x, motion_y, event_status)
> values('%t', '%f', '%q', '%n', '%Y-%m-%d %T', '%C', 'motion', 'unknown',
> '%D', '%N', '%i', '%J', '%K', '%L', 'Raw')
>
> The motion files are stored in the following path:
> /mnt/mybookworld/motion/raw/camera_1. Note that I've added several
> additional columns to this sql table over what Motion originally had.
>
> 2. Upon completion of the avi, I have Motion call a Perl script. The
> Perl script copies the avi event info to an "events" table. I also
> capture each frame's info in a "frames" table, where I record the
> associated jpeg, motion center and motion size. The actual jpegs and
> avi are then physically moved to a new location, i.e.
> /mnt/mybookworld/motion/camera_1/xxxx, where xxxx is the unique index
> of the event record from mysql. This xxxx index is used in the "frames"
> table to associate a frame mysql record with an event avi. The
> "raw_events" table is then cleaned out. Upon completion of this step
> the event has a status of "Raw". As part of this step, the "best" jpeg
> is selected to be associated with the avi and displayed on my web page.
> (A stripped-down sketch of this script appears after this list.)
>
> 3. On my 2nd pi (the non-motion one), the events table is routinely
> scanned for "Raw" events. When a Raw event is found, several things are
> done. I add graphics to the "best" jpeg, essentially the motion centers
> connected with colored lines, with each motion center also indicated by
> a different colored point; the colors relate to which motion region the
> motion is in. I also calculate a few other things, such as
> "percent_in_mask", which tells me what percent of the motion centers
> are in each region, and "motion_center_area", where I calculate the
> areas the motion centers are located in for each of the three regions.
> I also generate a "motion_string" for each event that looks like "33",
> "332211", "332233223322", etc. Finally, using the motion string and
> other information, I attempt to determine whether the event is a
> "vehicle" or a "person". For example, a "33" motion string is probably
> a vehicle, while a "332211" motion string might indicate a person
> coming to my front door. (A sketch of this last bit is also included
> after this list.) After all this, the event status is set to "New".
>
> 4. Once an event is "New" it shows up on my web page. At this point I
> can accept the suggested event type or enter my own if I don't agree
> with it. Either way, after setting the event type, the event status is
> set to "Active". I can also delete the event if need be. As part of
> step 3, possible event types include "Bogus" and "Flapping", and I can
> (or try to) automatically delete these types of events. When I delete
> an event, I remove it from the various sql tables and remove the files
> from disk, but I also add a record to an "auto_del_events" table to
> keep track of how many events I'm deleting. I use this information to
> automatically change the black & white mask Motion uses: on a calm day
> I use a mask with broad white areas; on a windy, bright day I switch to
> a mask with smaller white areas.
>
> 5.Once an event is “Active” I can then archive it. The archival
> process changes the event status to “Archive” and also physically
> moves the jpeg and avi to a different disk location, i.e.
> /mnt/mybookworld/motion/archive/camera_1/xxxx.
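>
> For those curious about step 2, the script Motion calls when the avi
> finishes (e.g. via on_movie_end) boils down to something like the
> sketch below. It is heavily simplified: the host, password and the
> events-table columns are placeholders, and the real script also fills
> the "frames" table and picks the "best" jpeg.
>
>     #!/usr/bin/perl
>     use strict;
>     use warnings;
>     use DBI;
>     use File::Copy qw(move);
>     use File::Path qw(make_path);
>
>     # called by Motion when a movie completes, with the avi path (%f)
>     my $avi = shift or die "usage: $0 <avi-file>\n";
>
>     my $dbh = DBI->connect('DBI:mysql:database=motion;host=192.168.1.11',
>                            'motion', 'secret', { RaiseError => 1 });
>
>     # promote the raw event row for this avi into the events table
>     $dbh->do("INSERT INTO events (camera, filename, time_stamp, event_status)
>               SELECT camera, filename, time_stamp, 'Raw'
>               FROM raw_events WHERE filename = ?", undef, $avi);
>     my $event_id = $dbh->last_insert_id(undef, undef, 'events', undef);
>
>     # move the avi (and, in the real script, its jpegs) under the event index
>     my $dest = "/mnt/mybookworld/motion/camera_1/$event_id";
>     make_path($dest);
>     move($avi, $dest) or die "move failed: $!";
>
>     # clean out the raw row(s) for this event
>     $dbh->do('DELETE FROM raw_events WHERE filename = ?', undef, $avi);
>     $dbh->disconnect;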
>
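> And for the typing in step 3, the core of it is small once the
> motion_string exists; roughly the following (the real rules also look
> at percent_in_mask and motion_center_area):
>
>     # first-cut guess at the event type from the motion_string
>     sub guess_event_type {
>         my ($motion_string) = @_;
>         # stayed on the street the whole time: "33", "333", ...
>         return 'vehicle' if $motion_string =~ /^3+$/;
>         # street -> steps -> walkway, e.g. "332211": someone coming to the door
>         return 'person'  if $motion_string =~ /3.*2.*1/;
>         # walkway -> steps -> street: someone leaving
>         return 'person'  if $motion_string =~ /1.*2.*3/;
>         # anything else gets looked at by hand
>         return 'unknown';
>     }
>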
> Current Development
>
> Currently my automatic event characterization is fairly simple. On a
> calm day I can detect vehicles driving past with almost 100% accuracy,
> although such an event could also be a person walking by or riding by
> on a bike; I hope to eventually be able to tell these apart. Also, many
> of my vehicle events are really two events (two cars passing in
> opposite directions), which I also hope to eventually detect. Likewise,
> on a very calm day I can characterize persons coming and going. It's
> when the wind starts picking up that things get confusing. Most of my
> current development activity is centered around improving the
> characterization, or event typing.
>
> Future Development
>
> ·Continual improvement of the web interface
>
> ·Add event notification – Send an email / text message when a person
> event occurs (a rough sketch of the idea is below).
>
> ·Develop a smartphone app.
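>
> For the notification item, something along these lines would probably
> do, run whenever an event gets typed as "person" (the addresses and
> SMTP host below are placeholders):
>
>     use Net::SMTP;
>
>     # minimal "person event" email notification
>     sub notify_person_event {
>         my ($event_id) = @_;
>         my $smtp = Net::SMTP->new('localhost') or die "no SMTP server";
>         $smtp->mail('motion@example.org');
>         $smtp->to('me@example.org');
>         $smtp->data();
>         $smtp->datasend("Subject: Motion person event $event_id\n\n");
>         $smtp->datasend("A person event was just recorded (event $event_id).\n");
>         $smtp->dataend();
>         $smtp->quit;
>     }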
>
> Conclusion
>
> The Motion program is pretty cool and works as advertised but by
> itself is not terribly useful. The above is my attempt at making the
> Motion output much more useful. I’m curious what others have done in
> the area of event characterization.
>
> Regards,
>
> Bruce W. Bodnyk
>
> bru...@ve...
>