********
Overview
********

This is a collection of scripts which collect netflow packets, load them into
a postgres database, and render them in a browser. The overview of the
dataflow is as follows:

{ data capture }

   router (exports netflow packets)
    |
    v
   nfcapd (captures packets, writes into a file, rotating every 5 mins)
    |
    v
   cron (fires load_nf.pl every 5 mins)
    |
    v
   load_nf.pl (examines nfcapd.xxx files, uses nfdump to print contents)
    |
    v
   csv2pg.pl (parses the CSV output from nfdump)
    |
    v
   postgres database
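
For example, the hand-off from a capture file to the database presumably
boils down to a pipeline like this (paths are illustrative):

    nfdump -r /data/j2320/2016/01/02/nfcapd.201601020040 -o csv -q \
        | /data/scripts/csv2pg.pl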


{ web GUI }
                   user     server
                   ====     ======

                browser --> calls get_utilization.html
                        <-- (webserver)

    executes javascript --> loads google chart APIs
                        --> calls get_utilization.pl
                                    |
                                    o--> pull data from postgres
                                    |
             javascript <-----------'
               |            (returns JSON data)
               v
(reads JSON data, renders chart)
               |
               v
   user clicks on chart --> calls get_utilization.pl
               ^                    |
               |                    o--> pull data from postgres
               |                    |
             javascript <-----------'
                            (returns HTML text)
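
In rough terms, get_utilization.pl queries postgres and emits JSON for the
javascript side to render. A minimal sketch, assuming DBI/DBD::Pg and a
hypothetical "flows" table (the real schema is whatever csv2pg.pl creates):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use CGI;
    use DBI;
    use JSON;

    my $cgi = CGI->new;
    my $dbh = DBI->connect("dbi:Pg:dbname=netflow", "joe", "secret",
                           { RaiseError => 1 });

    # hypothetical query: total bytes per capture interval
    my $rows = $dbh->selectall_arrayref(
        "SELECT stime, SUM(bytes) FROM flows GROUP BY stime ORDER BY stime");

    print $cgi->header("application/json");
    print encode_json($rows);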

************
Installation
************

1. install nfdump, see http://nfdump.sourceforge.net (currently tested with
version 1.6.13).
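
   For 1.6.x this is the usual autotools build (an assumption; check the
   nfdump documentation):

    $ ./configure && make && make install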

2. configure postgres and create a "netflow" database. Note that it may be
desirable to turn down logging, for example:

   client_min_messages = error
   log_min_messages = fatal
   log_min_error_statement = fatal
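
   Creating the database itself can be as simple as (role "joe" is just an
   example):

    $ createdb -O joe netflow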

3. run nfcapd, for example:

    # nfcapd -w -D -p 3001 -u joe -g users -S 1 -P /var/run/nfcapd.pid \
        -I j2320 -l /data/j2320

4. once nfcapd is running, expect files to be created (every 5 mins) similar
to:

    /data/j2320/2016/01/02/nfcapd.201601020040
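
   To sanity-check a capture file, print its contents with nfdump:

    $ nfdump -r /data/j2320/2016/01/02/nfcapd.201601020040 | head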

5. edit "load_nf.pl" (see the example settings below):
   a) ensure $cf_interval matches the interval at which cron invokes it.
   b) point $cf_nfdump at the "nfdump" tool.
   c) point $cf_csv2pg at the "csv2pg.pl" script.
   d) set $cf_dir to the base directory specified in step 3 (eg, /data/j2320).
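
   The settings then look something like this (values are examples; check
   the script's comments for the units $cf_interval expects):

    my $cf_interval = 5;                         # match the cron schedule
    my $cf_nfdump   = "/usr/local/bin/nfdump";
    my $cf_csv2pg   = "/data/scripts/csv2pg.pl";
    my $cf_dir      = "/data/j2320";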

6. configure cron to invoke load_nf.pl. For example:

    */5 * * * * /data/scripts/load_nf.pl >/data/scripts/load_nf.log 2>&1

7. place "get_utilization.html" in the web server's document root.

8. place "get_utilization.pl" in the web server's /cgi-bin/ directory.

9. edit the "get_utilization.pl" script (see the example below):
   a) set database credentials in $conf{"user"} and $conf{"passwd"}.
   b) declare internal networks in $conf{"internal"}.
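
   For instance (all values illustrative; the format $conf{"internal"}
   expects is defined by the script):

    $conf{"user"}     = "joe";
    $conf{"passwd"}   = "secret";
    $conf{"internal"} = "192.168.0.0/16";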

*************
General Notes
*************

1. The postgres database is required because the scripts rely on its "inet"
data type; databases without an equivalent type (eg, sqlite3) cannot be
used.
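
   For example, postgres can match an "inet" column against a prefix
   directly ("flows" and "srcaddr" are illustrative names, not the actual
   schema):

    $ psql netflow -c "SELECT * FROM flows WHERE srcaddr <<= '192.168.0.0/16'"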

