[Nfsen-discuss] Can I aggregate all files captured on one day into one?
From: Adrian P. <adr...@ro...> - 2006-08-02 08:13:29
Hello,

Because of limited disk space on the server I'm using for nfdump and nfsen, I can only keep about 33 hours of flow data. I would like to create a script, run automatically once a day from a cron job, that aggregates the data from the files covering a given period. The files can be located with "find", either by matching a name pattern or by selecting files created between time1 and time2.

I know nfdump can write binary files containing the flows that match a filter, and I have the syntax for that. My question is this: if I aggregate 100 files, each holding 5 minutes of flows, into a single file containing the 100 most bandwidth-consuming flows of that period, what should this file be named? I want to delete the original 100 files, and I want the aggregated file to be used by nfsen to display the usage graphs for that period, although I don't expect to get much information out of it if I run searches on that time range.

Will nfsen complain if it has to process a file that holds more than 5 minutes of flows, or will it work without special intervention?

Thank you for your help,
Adrian Popa
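For reference, the daily find-and-aggregate step described above could be sketched roughly as follows. The profile path, the `agg.$DAY` output name, and the assumption that the capture files follow the standard `nfcapd.YYYYMMDDhhmm` 5-minute naming are all mine, not anything nfsen prescribes (which is exactly the open question):

```shell
#!/bin/sh
# Daily cron sketch: aggregate yesterday's 5-minute nfcapd files
# into one binary file, then delete the originals.
# ASSUMED paths/names -- adjust to your nfsen profile layout.
BASE=/data/nfsen/profiles/live/router1

# Yesterday's date stamp (GNU date assumed; on BSD use: date -v-1d +%Y%m%d)
DAY=$(date -d yesterday +%Y%m%d)

# Standard nfcapd 5-minute slices run from hhmm 0000 to 2355.
FIRST=nfcapd.${DAY}0000
LAST=nfcapd.${DAY}2355
OUT=$BASE/agg.$DAY   # hypothetical name for the aggregated file

# -R reads the whole file range first:last, -a aggregates the flows,
# -w writes the result as a binary nfdump file.
# Guarded so the sketch is harmless on machines without nfdump installed.
if command -v nfdump >/dev/null 2>&1; then
    nfdump -R "$BASE/$FIRST:$LAST" -a -w "$OUT"
    # Once the aggregate is written, remove the source slices:
    # find "$BASE" -name "nfcapd.${DAY}*" -delete
fi

echo "would aggregate $FIRST .. $LAST into $OUT"
```

Whether nfsen will then pick up `agg.$DAY` for its graphs is the question asked above; the sketch only covers the aggregation side.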